It is no secret that privacy is an important part of our everyday lives. In the physical world we can shut the doors to our houses, keeping outside influences out. We are able to access information from different backgrounds, beliefs, and opinions. We create what we want and share it how we want. We are seen for our skills and knowledge, not for our zip codes and "probability of success." Within schools, our tools were constricted only by the rules of the teacher. Now, in the digital age, our privacy is encroached on through data even when the doors are closed. Our behaviors, learning experiences, and access to information are being recorded and constricted based on what an algorithm thinks is best for us.
I have read "MESSY & CHAOTIC LEARNING: A DOMAINS PRESENTATION AT KEENE STATE COLLEGE" by Martha Fay Burtis. I have also read "Understanding and Maintaining Your Privacy When Writing with Digital Technologies" by Lindsey C. Kim. After reading both, the issues of how user data threatens privacy and how the internet constricts its users cannot be ignored.
Privacy and Data
Individual privacy was an issue even before the modern age of technology. It is in our own space that our ideas can flourish without worry. Online, however, it is not our physical privacy that is in jeopardy but our informational privacy. This information can include things such as where we went to school, where we work, and where we live. But it can be even more invasive, covering our political views, search history, and what apps we use. Our private searches online turn into data.
As Kim puts it, “Networked activity requires near constant connection between users, platforms, services, and devices, so the concept of basing privacy purely on limiting access to our devices just does not work.”
Data in Education
To companies, our data is valuable because their users are just customers at the end of the day. The more a company learns about its customers' interests, the better it can sell them a product. To perfect this, algorithms are used to personalize a user's online experience, in turn feeding more information back to the companies that designed those algorithms. Burtis gives an example of this with educational companies such as Blackboard, WebCT, and Angel, which offer their services as a "product" for schools to buy into.
As a result, information about students, teachers, and staff alike is recorded. Information on students' grades, standardized test scores, and work can then be accessed by any actor within these companies who has access to it. This data can then be put into algorithms for each student, making their educational experience predictable and expected.
If a student or faculty member has an issue with how data is recorded and how it can breach privacy, it is tough luck from there on out, as all they can do is complain and hope that an update will address their complaints.
The issue of privacy and data is seen not only in education but in practically every corner of the digital realm. When a user visits a site, it asks them to accept its "cookies"; otherwise the site will not work or becomes very difficult to use. When using social media such as YouTube, Instagram, TikTok, Facebook, or Snapchat, the first thing an app will ask is for access to the user's data. This data is then passed around, sold, and exchanged to sell the user a product. What that product should be is determined by an algorithm that observes what the user likes, what they spend the most time looking at, and what keeps them engaged on the app.
I have had experience in this matter using YouTube and Instagram. For instance, I was looking into how to start a sticker-making business on YouTube, and I put watch time and research into making it a reality. A sticker-making business requires a printer made specifically for stickers. Lo and behold, after some time on Instagram I began receiving ads for "perfect and cheap" sticker printers for starting a sticker business. This demonstrates how privacy online is not inherently private and how our activity becomes data for the companies.
Privacy and data are not the only things both Burtis and Kim bring up in their articles. Another issue within the digital realm is the constriction it places on its users. For instance, in Burtis' discussion of education, companies such as Blackboard have made it so their users must abide by the company's rules at all times. This leads students' experience of the platform to be predictable and uninteresting: a cookie-cutter experience for all students. It limits not only what students can do but also teachers, if they are forced to use the platform and grade by it. In education, the experience should be unique to each student and their abilities, as not everyone learns the same way.
This constriction can also be seen in other parts of the internet, like social media. Algorithms help companies learn more about their users, and they are also used to personalize a user's experience to keep them engaged on the platform. This can mean pushing posts the user likes while slowly pushing away things the user does not like in order to hold their attention. The result can be a "filter bubble," as Kim puts it, where users get stuck in their own echo chamber. This reinforces their ideas, beliefs, and likes without much engagement outside of their bubble.
Overall, both Burtis and Kim raise important issues about digital space and how it can affect its users. It is important to understand one's privacy with respect to one's data. It is also important to acknowledge the constrictive effect the internet has on its users, keeping them in their bubbles without outside challenges to their ideas, beliefs, and likes.