Tag Archives: Privacy

University of California Berkeley study: Artificial intelligence advances threaten privacy of health data

6 Jan

Joseph Jerome, CIPP/US, wrote in the 2016 article, Why artificial intelligence may be the next big privacy trend:

What that looks like will vary, but it is likely that the same far-reaching and broad worries about fairness and accountability that have dogged every discussion about big data — and informed the FTC’s January Big Data Report — will present serious concerns for certain applications of AI. While “Preparing for the Future of Artificial Intelligence” is largely an exercise in stage-setting, the report is likely a harbinger of the same type of attention and focus that emerged within the advocacy community in the wake of the White House’s 2014 Big Data Report. For the privacy profession, the report hints at a few areas where our attention ought to be directed.
First, AI is still a nascent, immature field of engineering, and promoting that maturation process will involve a variety of different training and capacity-building efforts. The report explicitly recommends that ethical training, as well as training in security, privacy, and safety, should become an integral part of the curricula on AI, machine learning, and computer and data science at universities. Moving forward, one could imagine that ethical and other non-technical training will also be an important component of our STEM policies at large. Beyond formal education, however, building awareness among actual AI practitioners and developers will be essential to mitigate disconcerting or unintended behaviors, and to bolster public confidence in the application of artificial intelligence. Policymakers, federal agencies and civil society will need more in-house technical expertise to become more conversant on the current capabilities of artificial intelligence.
Second, while transparency is generally trotted out as the best of disinfectants, balancing transparency in the realm of AI will be a tremendous challenge for both competitive reasons and the “black box” nature of what we’re dealing with. While the majority of basic AI research is currently conducted by academics and commercial labs that collaborate to announce and publish their findings, the report ominously notes that competitive instincts could drive commercial labs towards increased secrecy, inhibiting the ability to monitor the progress of AI development and raising public concerns. But even if we can continue to promote transparency in the development of AI, it may be difficult for anyone, whether they be auditors, consumers, or regulators, to understand, predict, or explain the behaviors of more sophisticated AI systems.
The alternative appears to be bolstering accountability frameworks, but what exactly that looks like in this context is anyone’s guess. The report largely places its hopes on finding technical solutions to address accountability with respect to AI, and an IEEE effort on autonomous systems that I’ve been involved with has faced a similar roadblock. But if we have to rely on technical tools to put good intentions into practice, we will need more discussion about what those tools will be and how industry and individuals alike will be able to use them.
The Sky(net) isn’t falling, but… https://iapp.org/news/a/why-artificial-intelligence-may-be-the-next-big-privacy-trend/

A University of California, Berkeley study reported that the use of AI could create privacy problems for health data.

Science Daily reported in Artificial intelligence advances threaten privacy of health data:

Led by UC Berkeley engineer Anil Aswani, the study suggests current laws and regulations are nowhere near sufficient to keep an individual’s health status private in the face of AI development. The research was published Dec. 21 in the JAMA Network Open journal.
The findings show that by using artificial intelligence, it is possible to identify individuals by learning daily patterns in step data, such as that collected by activity trackers, smartwatches and smartphones, and correlating it to demographic data.
The mining of two years’ worth of data covering more than 15,000 Americans led to the conclusion that the privacy standards associated with 1996’s HIPAA (Health Insurance Portability and Accountability Act) legislation need to be revisited and reworked.
“We wanted to use NHANES (the National Health and Nutrition Examination Survey) to look at privacy questions because this data is representative of the diverse population in the U.S.,” said Aswani. “The results point out a major problem. If you strip all the identifying information, it doesn’t protect you as much as you’d think. Someone else can come back and put it all back together if they have the right kind of information.”
“In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying health care data from another company and matching the two,” he added. “Now they would have health care data that’s matched to names, and they could either start selling advertising based on that or they could sell the data to others.”
According to Aswani, the problem isn’t with the devices, but with how the information the devices capture can be misused and potentially sold on the open market.
“I’m not saying we should abandon these devices,” he said. “But we need to be very careful about how we are using this data. We need to protect the information. If we can do that, it’s a net positive.”
Though the study specifically looked at step data, the results suggest a broader threat to the privacy of health data…. https://www.sciencedaily.com/releases/2019/01/190103152906.htm

Citation:

Artificial intelligence advances threaten privacy of health data
Study finds current laws and regulations do not safeguard individuals’ confidential health information
Date: January 3, 2019
Source: University of California – Berkeley
Summary:
Advances in artificial intelligence, including activity trackers, smartphones and smartwatches, threaten the privacy of people’s health data, according to new research.

Journal Reference:
Liangyuan Na, Cong Yang, Chi-Cheng Lo, Fangyuan Zhao, Yoshimi Fukuoka, Anil Aswani. Feasibility of Reidentifying Individuals in Large National Physical Activity Data Sets From Which Protected Health Information Has Been Removed With Use of Machine Learning. JAMA Network Open, 2018; 1 (8): e186040 DOI: 10.1001/jamanetworkopen.2018.6040

Here is a portion of the JAMA abstract:

Original Investigation
Health Policy
December 21, 2018
Feasibility of Reidentifying Individuals in Large National Physical Activity Data Sets From Which Protected Health Information Has Been Removed With Use of Machine Learning
Liangyuan Na, BA; Cong Yang, BS; Chi-Cheng Lo, BS; Fangyuan Zhao, BS; Yoshimi Fukuoka, PhD, RN; Anil Aswani, PhD
JAMA Netw Open. 2018;1(8):e186040. doi:10.1001/jamanetworkopen.2018.6040
Key Points
Question Is it possible to reidentify physical activity data that have had protected health information removed by using machine learning?
Findings This cross-sectional study used national physical activity data from 14,451 individuals from the National Health and Nutrition Examination Surveys 2003-2004 and 2005-2006. Linear support vector machine and random forests reidentified the 20-minute-level physical activity data of approximately 80% of children and 95% of adults.
Meaning The findings of this study suggest that current practices for deidentifying physical activity data are insufficient for privacy and that deidentification should aggregate the physical activity data of many people to ensure individuals’ privacy.
Abstract
Importance Despite data aggregation and removal of protected health information, there is concern that deidentified physical activity (PA) data collected from wearable devices can be reidentified. Organizations collecting or distributing such data suggest that the aforementioned measures are sufficient to ensure privacy. However, no studies, to our knowledge, have been published that demonstrate the possibility or impossibility of reidentifying such activity data.
Objective To evaluate the feasibility of reidentifying accelerometer-measured PA data, which have had geographic and protected health information removed, using support vector machines (SVMs) and random forest methods from machine learning.
Design, Setting, and Participants In this cross-sectional study, the National Health and Nutrition Examination Survey (NHANES) 2003-2004 and 2005-2006 data sets were analyzed in 2018. The accelerometer-measured PA data were collected in a free-living setting for 7 continuous days. NHANES uses a multistage probability sampling design to select a sample that is representative of the civilian noninstitutionalized household (both adult and children) population of the United States.
Exposures The NHANES data sets contain objectively measured movement intensity as recorded by accelerometers worn during all walking for 1 week.
Main Outcomes and Measures The primary outcome was the ability of the random forest and linear SVM algorithms to match demographic and 20-minute aggregated PA data to individual-specific record numbers, and the percentage of correct matches by each machine learning algorithm was the measure…. https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2719130?resultClick=3
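The matching procedure the abstract describes — training a classifier to link demographic and 20-minute aggregated activity features back to individual record numbers — can be sketched with synthetic data. This is a hedged illustration only: the synthetic activity profiles, model settings, and evaluation below are assumptions, not the study's actual NHANES pipeline.

```python
# Sketch of reidentification via machine learning: a random forest is
# trained on "deidentified" daily activity vectors labeled only by record
# number, then asked to link a fresh day of data back to individuals.
# All data here is synthetic; the real study used NHANES accelerometer data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

n_people = 50
n_bins = 72  # 24 hours split into 20-minute bins

# Each person has a characteristic daily activity profile (their habits),
# observed on several days with noise.
profiles = rng.gamma(shape=2.0, scale=1.0, size=(n_people, n_bins))

def observe(day_noise=0.3):
    """One noisy day of 20-minute activity levels for every person."""
    noise = rng.normal(0, day_noise, size=profiles.shape)
    return np.clip(profiles + noise, 0, None)

# "Deidentified" training days: activity vectors labeled only by record number.
X_train = np.vstack([observe() for _ in range(5)])
y_train = np.tile(np.arange(n_people), 5)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# A fresh day of data: can each record be linked back to the right person?
X_test = observe()
accuracy = (clf.predict(X_test) == np.arange(n_people)).mean()
print(f"reidentification accuracy: {accuracy:.0%}")
```

The point of the sketch is that stripping names does nothing to hide the behavioral "fingerprint" in the activity pattern itself, which is exactly the weakness Aswani describes.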

Here is the press release from UC Berkeley:

PUBLIC RELEASE: 3-JAN-2019
Artificial intelligence advances threaten privacy of health data
Study finds current laws and regulations do not safeguard individuals’ confidential health information
Advances in artificial intelligence have created new threats to the privacy of people’s health data, a new University of California, Berkeley, study shows.
Led by UC Berkeley engineer Anil Aswani, the study suggests current laws and regulations are nowhere near sufficient to keep an individual’s health status private in the face of AI development. The research was published Dec. 21 in the JAMA Network Open journal.
The findings show that by using artificial intelligence, it is possible to identify individuals by learning daily patterns in step data, such as that collected by activity trackers, smartwatches and smartphones, and correlating it to demographic data.
The mining of two years’ worth of data covering more than 15,000 Americans led to the conclusion that the privacy standards associated with 1996’s HIPAA (Health Insurance Portability and Accountability Act) legislation need to be revisited and reworked.
“We wanted to use NHANES (the National Health and Nutrition Examination Survey) to look at privacy questions because this data is representative of the diverse population in the U.S.,” said Aswani. “The results point out a major problem. If you strip all the identifying information, it doesn’t protect you as much as you’d think. Someone else can come back and put it all back together if they have the right kind of information.”
“In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying health care data from another company and matching the two,” he added. “Now they would have health care data that’s matched to names, and they could either start selling advertising based on that or they could sell the data to others.”
According to Aswani, the problem isn’t with the devices, but with how the information the devices capture can be misused and potentially sold on the open market.
“I’m not saying we should abandon these devices,” he said. “But we need to be very careful about how we are using this data. We need to protect the information. If we can do that, it’s a net positive.”
Though the study specifically looked at step data, the results suggest a broader threat to the privacy of health data.
“HIPAA regulations make your health care private, but they don’t cover as much as you think,” Aswani said. “Many groups, like tech companies, are not covered by HIPAA, and only very specific pieces of information are not allowed to be shared by current HIPAA rules. There are companies buying health data. It’s supposed to be anonymous data, but their whole business model is to find a way to attach names to this data and sell it.”
Aswani said that as advances in AI make it easier for companies to gain access to health data, the temptation for companies to use it in illegal or unethical ways will increase. Employers, mortgage lenders, credit card companies and others could potentially use AI to discriminate based on pregnancy or disability status, for instance.
“Ideally, what I’d like to see from this are new regulations or rules that protect health data,” he said. “But there is actually a big push to even weaken the regulations right now. For instance, the rule-making group for HIPAA has requested comments on increasing data sharing. The risk is that if people are not aware of what’s happening, the rules we have will be weakened. And the fact is the risks of us losing control of our privacy when it comes to health care are actually increasing and not decreasing.”
###
Co-authors of the study are Liangyuan Na of MIT; Cong Yang and Chi-Cheng Lo of UC Berkeley; Fangyuan Zhao of Tsinghua University in China; and Yoshimi Fukuoka of UCSF.
Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.

RAND Corporation has information about health care privacy at https://www.rand.org/topics/health-information-privacy.html

StaySafeOnline described health care privacy issues in the article, Health Information Privacy – Why Should We Care?

• Health data is very personal and may contain information we wish to keep confidential (e.g., mental health records) or potentially impact employment prospects or insurance coverage (e.g., chronic disease or family health history).
• It is long living – an exposed credit card can be canceled, but your medical history stays with you a lifetime.
• It is very complete and comprehensive – the information health care organizations have about their patients includes not only medical data, but also insurance and financial account information. This could be personal information like Social Security numbers, addresses or even the names of next of kin. Such a wealth of data can be monetized by cyber adversaries in many ways.
• In our digital health care world, the reliable availability of accurate health data to clinicians is critical to care delivery and any disruption in access to that data can delay care or jeopardize diagnosis.
The privacy and security of health information is strictly regulated in the U.S. under federal laws, such as the Health Insurance Portability and Accountability Act of 1996 (HIPAA), but also through various state laws and laws protecting individuals against discrimination based on genetic data….
For health care providers and insurers, there is typically no limitation for patients to disclose information about their health. Just as any patient can (and mostly should) share concerns about their health with family and friends, any patient can now easily share anything they want with the world via social media or join an online support group. Although these are generally positive steps that help an individual with health concerns find support and receive advice, we now need to be much more conscious about what…
However, concerns about your health care provider’s ability to protect your data should not lead to patients withholding information. Even in this digital age, the patient-doctor trust relationship is still the most important aspect of our health care system – and that trust goes both ways: patients need to trust their providers with often intimate and personal information, and providers need to know that their patients are not withholding anything due to privacy concerns.
We have entered the new age of digital medicine and almost universal availability of information, leading to better diagnosis and more successful treatments, ultimately reducing suffering and extending lives. However, this great opportunity also comes with new risks and we all – health care providers and patients alike – need to be conscious about how we use this new technology and share information…. https://staysafeonline.org/blog/health-information-privacy-care/

Resources:

Artificial Intelligence Will Redesign Healthcare https://medicalfuturist.com/artificial-intelligence-will-redesign-healthcare

9 Ways Artificial Intelligence is Affecting the Medical Field https://www.healthcentral.com/slideshow/8-ways-artificial-intelligence-is-affecting-the-medical-field#slide=2

Where information leads to Hope. © Dr. Wilda.com

Dr. Wilda says this about that ©

Blogs by Dr. Wilda:

COMMENTS FROM AN OLD FART©
http://drwildaoldfart.wordpress.com/

Dr. Wilda Reviews ©
http://drwildareviews.wordpress.com/

Dr. Wilda ©
https://drwilda.com/

University of Washington study: For $1000, anyone can purchase online ads to track your location and app use

19 Oct

“All happy families are alike; each unhappy family is unhappy in its own way.”
Leo Tolstoy, Anna Karenina

Tolstoy may not have been specifically talking about domestic violence, but each situation is unique. There is a specific story and specific journey for each victim, each couple, and each abuser. There is no predicted endpoint for domestic violence; each situation will have its own outcome.

Headlines regularly detail incidents of domestic violence involving sports figures and other prominent people. Domestic Violence is a societal problem. According to Safe Horizon:

The Victims
1 in 4 women will experience domestic violence during her lifetime.
Women experience more than 4 million physical assaults and rapes because of their partners, and men are victims of nearly 3 million physical assaults.
Women are more likely to be killed by an intimate partner than men.
Women ages 20 to 24 are at greatest risk of becoming victims of domestic violence.
Every year, 1 in 3 women who is a victim of homicide is murdered by her current or former partner…. http://www.safehorizon.org/page/domestic-violence-statistics–facts-52.html

Abusers come in all races, classes, genders, religions and creeds.

Andy Greenberg reported in the Wired article, It Takes Just $1,000 to Track Someone’s Location With Mobile Ads:

When you consider the nagging privacy risks of online advertising, you may find comfort in the thought of a vast, abstract company like Pepsi or Nike viewing you as just one data point among millions. What, after all, do you have to hide from Pepsi? And why should that corporate megalith care about your secrets out of countless potential Pepsi drinkers? But an upcoming study has dissipated that delusion. It shows that ad-targeting can not only track you at the personal, individual level but also that it doesn’t take a corporation’s resources to seize upon that surveillance tool—just time, determination, and about a thousand dollars.
A team of security-focused researchers from the University of Washington has demonstrated just how deeply even someone with modest resources can exploit mobile advertising networks. An advertising-savvy spy, they’ve shown, can spend just a grand to track a target’s location with disturbing precision, learn details about them like their demographics and what apps they have installed on their phone, or correlate that information to make even more sensitive discoveries—say, that a certain twentysomething man has a gay dating app installed on his phone and lives at a certain address, that someone sitting next to the spy at a Starbucks took a certain route after leaving the coffee shop, or that a spy’s spouse has visited a particular friend’s home or business… https://www.wired.com/story/track-location-with-mobile-ads-1000-dollars-study/

Tracking a partner’s movements is one element of control in an abusive relationship.

Rachael Williams wrote in the Guardian article, Spyware and smartphones: how abusive men track their partners:

New technology is being developed so quickly, and social media pervades so many aspects of our lives, that it is hard to stay ahead, says Jennifer Perry, the chief executive of the Digital Trust, which supports victims of digital abuse. In fact, spyware, she reckons, is “yesterday’s technology” for tracking victims: “The easiest thing is to access the woman in the cloud. A man might buy a phone and set it up for his partner to be ‘helpful’. He knows the username and password. You have women who don’t even realise they have a cloud account in their smartphone.
“There is also an app you can buy that mirrors the phone on to a PC. The man can just sit at his computer and watch everything that happens on the phone.”
The technology is cheap and accessible, she says. And evading it is often not as simple as just turning the phone off. Perry usually advises women to take their sim card out, leave the phone with a friend until it can be cleaned, and use a cheap pay-as-you-go device in the meantime. But if her ex-partner owns the phone, it will never be safe.
Cloud storage is particularly problematic because it is linked to laptops and PCs, which, unlike phones, can have spyware installed on them remotely via email. “You often find that a woman had spyware put on to her computer remotely, so even if she changes the username and password for the cloud on her phone, the abuser can see that on the computer and get back in,” Perry says.
Perpetrators don’t just use this technology to find out where an escaping partner has gone; it is another tool for abuse when they’re together, too. “They will use the information to belittle or threaten the woman,” says Clare Laxton, public policy manager at Women’s Aid. “They’ll say: ‘Why were you at this restaurant? You’re cheating on me, I’m going to kill myself.’ It closes down that woman’s space, so she won’t want to go out and socialise, because she knows the abuse she’ll get when she gets home isn’t worth it. It’s all part of controlling her as much as possible….” https://www.theguardian.com/lifeandstyle/2015/jan/25/spyware-smartphone-abusive-men-track-partners-domestic-violence

Science Daily reported about privacy concerns:

Privacy concerns have long swirled around how much information online advertising networks collect about people’s browsing, buying and social media habits — typically to sell you something.
But could someone use mobile advertising to learn where you go for coffee? Could a burglar establish a sham company and send ads to your phone to learn when you leave the house? Could a suspicious employer see if you’re using shopping apps on work time?
The answer is yes, at least in theory. New University of Washington research, which will be presented Oct. 30 at the Association for Computing Machinery’s Workshop on Privacy in the Electronic Society, suggests that for roughly $1,000, someone with devious intent can purchase and target online advertising in ways that allow them to track the location of other individuals and learn what apps they are using….

Citation:

For $1000, anyone can purchase online ads to track your location and app use
Date: October 18, 2017
Source: University of Washington
Summary:
New research finds that for a budget of roughly $1000, it is possible for someone to track your location and app use by purchasing and targeting mobile ads. The team hopes to raise industry awareness about the potential privacy threat. https://www.sciencedaily.com/releases/2017/10/171018124131.htm

Here is the press release from the University of Washington:

October 18, 2017

For $1000, anyone can purchase online ads to track your location and app use
Jennifer Langston

UW News

New University of Washington research finds that for a budget of roughly $1000, it is possible for someone to track your location and app use by purchasing and targeting mobile ads. The team aims to raise industry awareness about the potential privacy threat.

Privacy concerns have long swirled around how much information online advertising networks collect about people’s browsing, buying and social media habits — typically to sell you something.

But could someone use mobile advertising to learn where you go for coffee? Could a burglar establish a sham company and send ads to your phone to learn when you leave the house? Could a suspicious employer see if you’re using shopping apps on work time?

The answer is yes, at least in theory. New University of Washington research, to be presented in a paper Oct. 30 at the Association for Computing Machinery’s Workshop on Privacy in the Electronic Society, suggests that for roughly $1,000, someone with devious intent can purchase and target online advertising in ways that allow them to track the location of other individuals and learn what apps they are using.
“Anyone from a foreign intelligence agent to a jealous spouse can pretty easily sign up with a large internet advertising company and on a fairly modest budget use these ecosystems to track another individual’s behavior,” said lead author Paul Vines, a recent doctoral graduate in the UW’s Paul G. Allen School of Computer Science & Engineering.

The research team set out to test whether an adversary could exploit the existing online advertising infrastructure for personal surveillance and, if so, raise industry awareness about the threat.

“Because it was so easy to do what we did, we believe this is an issue that the online advertising industry needs to be thinking about,” said co-author Franzi Roesner, co-director of the UW Security and Privacy Research Lab and an assistant professor in the Allen School. “We are sharing our discoveries so that advertising networks can try to detect and mitigate these types of attacks, and so that there can be a broad public discussion about how we as a society might try to prevent them.”

This map represents an individual’s morning commute. Red dots reflect the places where the UW computer security researchers were able to track that person’s movements by serving location-based ads: at home (real location not shown), a coffee shop, bus stop and office. The team found that a target needed to stay in one location for roughly four minutes before an ad was served, which is why no red dots appear along the individual’s bus commute (dashed line) or walking route (solid line). Image: University of Washington

The researchers discovered that an individual ad purchaser can, under certain circumstances, see when a person visits a predetermined sensitive location — a suspected rendezvous spot for an affair, the office of a company that a venture capitalist might be interested in or a hospital where someone might be receiving treatment — within 10 minutes of that person’s arrival. They were also able to track a person’s movements across the city during a morning commute by serving location-based ads to the target’s phone.

The team also discovered that individuals who purchase the ads could see what types of apps their target was using. That could potentially divulge information about the person’s interests, dating habits, religious affiliations, health conditions, political leanings and other potentially sensitive or private information.
Someone who wants to surveil a person’s movements first needs to learn the mobile advertising ID (MAID) for the target’s mobile phone. These unique identifiers that help marketers serve ads tailored to a person’s interests are sent to the advertiser and a number of other parties whenever a person clicks on a mobile ad. A person’s MAID also could be obtained by eavesdropping on an unsecured wireless network the person is using or by gaining temporary access to his or her WiFi router.
The UW team demonstrated that customers of advertising services can purchase a number of hyperlocal ads through that service, which will only be served to that particular phone when its owner opens an app in a particular spot. By setting up a grid of these location-based ads, the adversary can track the target’s movements if he or she has opened an app and remains in a location long enough for an ad to be served — typically about four minutes, the team found.
Importantly, the target does not have to click on or engage with the ad — the purchaser can see where ads are being served and use that information to track the target through space. In the team’s experiments, they were able to pinpoint a person’s location within about 8 meters.
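The grid-of-ads idea above can be sketched as a toy simulation: the adversary pre-buys one hyperlocal ad per grid cell and infers the target's position from which cell reports an impression. The cell size, dwell-time rule, and reporting model below are simplifying assumptions for illustration, not the paper's actual measurements.

```python
# Toy simulation of ad-grid location tracking: which bought cell serves
# an ad reveals where the target is, without any click or engagement.
from dataclasses import dataclass

CELL_METERS = 8          # roughly the precision the UW team reported
MIN_DWELL_MINUTES = 4    # ads were served only after ~4 minutes in one spot

@dataclass
class AdBuy:
    cell: tuple  # (grid_x, grid_y) the hyperlocal ad targets

def to_cell(x_m, y_m):
    """Map a position in meters to the grid cell containing it."""
    return (int(x_m // CELL_METERS), int(y_m // CELL_METERS))

def impressions(track, ad_buys):
    """Yield (minute, cell) each time the target has dwelled long enough
    in a cell the adversary bought an ad for."""
    bought = {ad.cell for ad in ad_buys}
    seen = []
    dwell = 0
    prev = None
    for minute, (x, y) in enumerate(track):
        cell = to_cell(x, y)
        dwell = dwell + 1 if cell == prev else 1
        prev = cell
        if dwell >= MIN_DWELL_MINUTES and cell in bought:
            seen.append((minute, cell))
    return seen

# Target stays 10 minutes at a coffee shop, then walks away quickly.
coffee_shop = (40.0, 16.0)
track = [coffee_shop] * 10 + [(40.0 + 20 * t, 16.0) for t in range(1, 6)]
buys = [AdBuy(cell=to_cell(*coffee_shop))]

hits = impressions(track, buys)
print(hits)
```

As in the study, the stationary stop is revealed, while the fast-moving walk produces no impressions because no cell accumulates enough dwell time.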

“To be very honest, I was shocked at how effective this was,” said co-author Tadayoshi Kohno, an Allen School professor who has studied security vulnerabilities in products ranging from automobiles to medical devices. “We did this research to better understand the privacy risks with online advertising. There’s a fundamental tension that as advertisers become more capable of targeting and tracking people to deliver better ads, there’s also the opportunity for adversaries to begin exploiting that additional precision. It is important to understand both the benefits and risks with technologies.”

An individual could potentially disrupt the simple types of location-based attacks that the UW team demonstrated by frequently resetting the mobile advertising IDs in their phones — a feature that many smartphones now offer. Disabling location tracking within individual app settings could help, the researchers said, but advertisers still may be capable of harvesting location data in other ways.
On the industry side, mobile and online advertisers could help thwart these types of attacks by rejecting ad buys that target only a small number of devices or individuals, the researchers said. They also could develop and deploy machine learning tools to distinguish between normal advertising patterns and suspicious advertising behavior that looks more like personal surveillance.
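The second industry-side defense mentioned above — machine learning that separates normal campaigns from surveillance-style buys — could be sketched as simple anomaly detection. The two features (devices targeted, targeting radius) and all numbers below are invented assumptions; real ad networks would meter far more signals.

```python
# Hedged sketch of flagging suspicious ad buys: train an anomaly detector
# on normal campaign profiles, then score a buy that targets only a
# handful of devices in a tiny area, as personal surveillance would.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Normal campaigns: many devices, wide geographic spread.
normal = np.column_stack([
    rng.integers(5_000, 500_000, size=200),   # devices targeted
    rng.uniform(10, 2_000, size=200),         # targeting radius (km)
])

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# A surveillance-style buy: a handful of devices, a few city blocks.
suspicious = np.array([[3, 0.2]])
print(detector.predict(suspicious))  # -1 marks the buy as anomalous
```

A production system would need far richer features and human review, but the design choice is the one the researchers suggest: reject or escalate buys whose targeting footprint is implausibly narrow for marketing.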
The UW Security and Privacy Research Lab is a leader in evaluating potential security threats in emerging technologies, including telematics in automobiles, web browsers, DNA sequencing software and augmented reality, before they can be exploited by bad actors.

Next steps for the team include working with experts at the UW’s Tech Policy Lab to explore the legal and policy questions raised by this new form of potential intelligence gathering.

The research was funded by The National Science Foundation, The Tech Policy Lab and the Short-Dooley Professorship.

For more information, contact the research team at adint@cs.washington.edu.
Grant number: NSF: CNS-1463968

Resources:

Cell Phone Location Tracking Laws By State https://www.aclu.org/issues/privacy-technology/location-tracking/cell-phone-location-tracking-laws-state

Mobile Phone Safety for a Domestic Abuse Victim http://www.getdomesticviolencehelp.com/domestic-abuse-victim.html

Smartphones Are Used To Stalk, Control Domestic Abuse Victims http://www.npr.org/sections/alltechconsidered/2014/09/15/346149979/smartphones-are-used-to-stalk-control-domestic-abuse-victims

Where information leads to Hope. © Dr. Wilda.com

Dr. Wilda says this about that ©

Blogs by Dr. Wilda:

COMMENTS FROM AN OLD FART©
http://drwildaoldfart.wordpress.com/

Dr. Wilda Reviews ©
http://drwildareviews.wordpress.com/

Dr. Wilda ©
https://drwilda.com/

U.S. Department of Education guidelines on student privacy

26 Feb

Many schools and districts are using cloud computing. Judith Hurwitz, Robin Bloor, Marcia Kaufman, and Fern Halper, the authors of Cloud Computing For Dummies, explained the concept in What Is Cloud Computing?

Cloud computing is the next stage in the Internet’s evolution, providing the means through which everything — from computing power to computing infrastructure, applications, business processes to personal collaboration — can be delivered to you as a service wherever and whenever you need.
The “cloud” in cloud computing can be defined as the set of hardware, networks, storage, services, and interfaces that combine to deliver aspects of computing as a service. Cloud services include the delivery of software, infrastructure, and storage over the Internet (either as separate components or a complete platform) based on user demand. (See Cloud Computing Models for the lowdown on the way clouds are used.)
Cloud computing has four essential characteristics: elasticity and the ability to scale up and down; self-service provisioning and automatic deprovisioning; application programming interfaces (APIs); and billing and metering of service usage in a pay-as-you-go model. (Cloud Computing Characteristics discusses these elements in detail.) This flexibility is what is attracting individuals and businesses to move to the cloud.
The world of the cloud has lots of participants:
•The end user who doesn’t have to know anything about the underlying technology.
•Business management who needs to take responsibility for the governance of data or services living in a cloud. Cloud service providers must provide a predictable and guaranteed service level and security to all their constituents. (Find out what providers have to consider in Cloud Computing Issues.)
•The cloud service provider who is responsible for IT assets and maintenance.
Cloud computing is offered in different forms: public clouds, private clouds, and hybrid clouds, which combine both public and private. (You can get a sense of the differences among these kinds of clouds in Deploying Public, Private, or Hybrid Clouds.)
Cloud computing can completely change the way companies use technology to service customers, partners, and suppliers…. http://www.dummies.com/how-to/content/what-is-cloud-computing.html
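The pay-as-you-go metering characteristic quoted above can be illustrated with a toy billing calculation. The service names, unit rates, and usage figures are invented for illustration:

```python
# Toy illustration of pay-as-you-go metering: cost tracks metered usage,
# not a fixed license fee. Rates and usage figures are invented.

RATES = {"compute_hours": 0.05, "storage_gb_months": 0.02}  # assumed unit prices (USD)

def monthly_bill(usage):
    """Sum metered usage times the per-unit rate for each service consumed."""
    return sum(RATES[service] * amount for service, amount in usage.items())

usage = {"compute_hours": 100, "storage_gb_months": 50}
print(f"${monthly_bill(usage):.2f}")  # 100*0.05 + 50*0.02 = $6.00
```

The point is simply that spending scales with consumption, which is why upfront costs are lower than buying and installing software district-wide.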

Moi wrote about cloud privacy concerns in Does ‘cloud storage’ affect student privacy rights? https://drwilda.com/2013/02/19/does-cloud-storage-affect-student-privacy-rights/

Benjamin Herold reported in the Education Week article, U.S. Education Department Issues Guidance on Student Data Privacy:

The new federal guidelines are non-binding and contain no new regulations, reflecting a desire to encourage “self-policing” by industry and better policies and practices by school systems as first steps towards shoring up students’ privacy protections.
Dozens of privacy-related bills are making their way through statehouses this spring, however, and U.S. Senator Edward Markey, a Democrat from Massachusetts who has been critical of the education department’s stance on privacy, said Monday he would soon introduce new federal legislation on the matter.
Reaction to the document from key stakeholder groups was swift, reflecting the growing urgency around data-security issues. A trade association for the software and digital content industries commended the department for an approach it said “affirms and reinforces the strong safeguards in current law,” while a leading parent advocate said the guidance “completely misses the point when it comes to addressing parental concerns about their children’s privacy and security.”
FERPA Questions
Much of the 14-page department document, titled “Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices,” focuses on the Family Educational Rights and Privacy Act, or FERPA….
The departmental guidance issued Tuesday, however, makes clear that FERPA and another relevant federal statute, the Protection of Pupil Rights Amendment, are somewhat limited in their power to prevent such outcomes in the new age of “big data” and ubiquitous digital learning tools.
Take, for example, the “metadata” collected on students via digital devices and online learning programs, which can include keystroke information, the time and place at which a device or program is being used, the type of device on which the service is being accessed, and more.
Under some circumstances, such metadata are not protected under FERPA and may thus be eligible to be used for data-mining and other non-educational purposes.
According to the federal guidelines, vendors that have not collected any personally identifiable information on individual students may be permitted to use metadata for data-mining and other purposes.
And even when vendors have collected personally identifiable information on students, they may still be permitted to use metadata for their own purposes, provided those data are stripped of any identifying elements, and so long as the vendor received students’ information under an exception to FERPA that allows vendors to more easily be designated as “school officials…”
Privacy advocates, however, have criticized—and even sued—the department over its recent decisions to expand the definitions of who may be authorized to gain access to student data under FERPA.
“The guidance really underscores the fact that student privacy rights are under attack, and it was the [department’s] regulations that opened the door,” said Khaliah Barnes, an attorney with the nonprofit Electronic Privacy Information Center.
Best Practices
The uses of students’ personally identifiable information by third-party vendors can also be murky, according to the guidelines.
Under FERPA, parental consent is usually required for the disclosure of such information, although there are exceptions. Schools and districts are also supposed to maintain “direct control” over their data, even after it is passed to third parties—a requirement that is hugely complex given the massive amounts of data now being collected, the rise of cloud-based service providers, and the rapid-fire cycle of business start-ups, mergers, and acquisitions that mark today’s ed-tech landscape.
The new guidelines suggest that better contracting practices and school- and district-level policies are key to protecting student privacy amid all the confusion.
Among the best practices recommended by the department:
Maintain awareness of relevant federal, state, tribal, or local laws, particularly the Children’s Online Privacy Protection Act, which includes requirements for providing online educational services to children under 13.
Be aware of which online educational services are currently being used in your district. Conducting an inventory of all such services is one specific step districts can take.
Have policies and procedures to evaluate and approve proposed online educational services, including both formal contracts and no-cost software that requires only click-through consent.
When possible, use a written contract or legal agreement. Provisions should be included for security and data stewardship; the collection of data; the use, retention, disclosure, and destruction of data; the right of parents and students to access and modify their data; and more.
Such reliance on district contracting processes and policy development could pose a problem, given the current state of such efforts. In December, Fordham University professor Joel Reidenberg published a scathing study of the shortcomings and vulnerabilities of most districts’ contracts with cloud-service providers. http://blogs.edweek.org/edweek/DigitalEducation/2014/02/us_ed_dept_issues_guidance_on_.html
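The de-identification step the guidance describes (stripping metadata of identifying elements before a vendor uses it for its own purposes) can be sketched as a field filter. The field names here are hypothetical, and real FERPA de-identification requires far more than deleting obvious identifiers:

```python
# Hypothetical sketch of stripping identifying elements from a metadata record
# before secondary use. Field names are invented; genuine de-identification
# must also consider re-identification risk, not just obvious identifiers.

IDENTIFYING_FIELDS = {"student_name", "student_id", "email"}  # assumed PII fields

def strip_identifiers(record):
    """Return a copy of the record with the assumed identifying fields removed."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}

record = {"student_name": "Jane Doe", "student_id": "S123",
          "device_type": "tablet", "keystrokes_per_minute": 42}
print(strip_identifiers(record))
```

As the privacy advocates quoted above point out, the hard questions are about which fields count as identifying and who decides, not the mechanics of removing them.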

See, http://law.fordham.edu/32158.htm

Here is the citation from the U.S. Department of Education:

Protecting Student Privacy While Using Online Educational Services
PTAC is pleased to announce the release of new guidance, “Protecting Student Privacy While Using Online Educational Services.” http://ptac.ed.gov/sites/default/files/Student%20Privacy%20and%20Online%20Educational%20Services%20%28February%202014%29.pdf This guidance should clarify questions related to student privacy and the use of educational technology in the classroom.
The Department of Education and PTAC will be holding a joint webinar on March 13 to review this guidance and solicit your input on it. To register for the webinar, please click here.
If you require a reasonable accommodation to participate in the webinar, please notify Ross Lemke at ross.lemke@ed.gov by March 6th. For those who are unable to join the webinar on March 13, a recording and transcript will be posted to the PTAC website.
http://ptac.ed.gov/

See, Testing the Waters of Cloud Computing http://www.scholastic.com/browse/article.jsp?id=3753288

Sean Cavanaugh reported in the Education Week article, Districts’ Use of Cloud Computing Brings Privacy Risks, Study Says:

School districts have become increasingly reliant on cloud-based technologies despite “substantial deficiencies” in policies governing those Web-based systems and their protection of private student data, a new study finds.
The study, released today by the Fordham Law School’s Center on Law and Information Policy, seeks to provide the first national examination of privacy and cloud computing in public schools. The study authors also put forward a series of recommendations to policymakers for ramping up safeguards on students’ private information.
Fordham researchers based their study on a national sample of public school districts, asking for detailed information from 54 urban, suburban, and rural systems around the country.
Among the information they sought: contracts between districts and technology vendors; policies governing privacy and computer use; and notices sent to parents about student privacy and districts’ use of free or paid, third-party consulting services.
The study concludes that privacy implications for districts’ use of cloud services are “poorly understood, non-transparent, and weakly governed.”
Only 25 percent of the districts examined made parents aware of the use of cloud services, according to the study. Twenty percent do not have policies governing the use of those services, and a large plurality of districts have “rampant gaps” in their documentation of privacy policies in contracts and other forms.
To make matters worse, districts often relinquish control of student information when using cloud services, and do not have contracts or agreements setting clear limits on the disclosure, sale, and marketing of that data, the Fordham researchers say.
The Fordham study concludes that districts, policymakers, and vendors should consider taking a number of steps to increase privacy protections, including:
• Providing parents with sufficient notice of the transfer of student information to cloud-service providers, and assuring that parental consent is sought when required by federal law;
• Improving contracts between private vendors and districts to remove ambiguity and provide much more specific information on the disclosure and marketing of student data;
• Setting clearer policies on data governance within districts, which includes establishing rules barring employees from using cloud services not approved by districts. States and large districts should also hire “chief privacy officers” responsible for maintaining data protections;
• Establishing a national research center and clearinghouse to study privacy issues, and draft and store model contracts on privacy issues. The center should be “independent of commercial interests to assure objectivity,” the study authors said.
“School districts throughout the country are embracing the use of cloud computing services for important educational goals, but have not kept pace with appropriate safeguards for the personal data of school children,” said Joel Reidenberg, a professor at Fordham’s law school who worked on the study, in a statement accompanying its release. “There are critical actions that school districts and vendors must take to address the serious deficiencies in privacy protection….” http://blogs.edweek.org/edweek/DigitalEducation/2013/12/fewer.html?intc=es

Citation:

Center on Law and Information Policy
Privacy and Cloud Computing in Public Schools
Joel R. Reidenberg, Fordham University School of Law
N. Cameron Russell, Fordham University School of Law
Jordan Kovnot, Fordham University School of Law
Thomas B. Norton, Fordham University School of Law
Ryan Cloutier, Fordham University School of Law
Daniela Alvarado, Fordham University School of Law
Download Full Text (760 KB)
http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1001&context=clip

There is a complex intertwining of laws that often prevents school officials from disclosing much about students.

Resources:

What cloud computing really means
http://www.infoworld.com/d/cloud-computing/what-cloud-computing-really-means-031

What Is Cloud Computing?
http://www.pcmag.com/article2/0,2817,2372163,00.asp

FERPA General Guidance for Students
http://ed.gov/policy/gen/guid/fpco/ferpa/students.html

No Child Left Behind A Parents Guide
http://ed.gov/parents/academic/involve/nclbguide/parentsguide.pdf

Related:

Data mining in education
https://drwilda.com/2012/07/19/data-mining-in-education/

Who has access to student records?
https://drwilda.com/2012/06/11/who-has-access-to-student-records/


Fordham Center on Law and Information Policy study: Cloud computing poses privacy risks for school information

15 Dec

Many schools and districts are using cloud computing. Judith Hurwitz, Robin Bloor, Marcia Kaufman, and Fern Halper from Cloud Computing For Dummies wrote about cloud computing in What Is Cloud Computing?

Cloud computing is the next stage in the Internet’s evolution, providing the means through which everything — from computing power to computing infrastructure, applications, business processes to personal collaboration — can be delivered to you as a service wherever and whenever you need.
The “cloud” in cloud computing can be defined as the set of hardware, networks, storage, services, and interfaces that combine to deliver aspects of computing as a service. Cloud services include the delivery of software, infrastructure, and storage over the Internet (either as separate components or a complete platform) based on user demand. (See Cloud Computing Models for the lowdown on the way clouds are used.)
Cloud computing has four essential characteristics: elasticity and the ability to scale up and down; self-service provisioning and automatic deprovisioning; application programming interfaces (APIs); and billing and metering of service usage in a pay-as-you-go model. (Cloud Computing Characteristics discusses these elements in detail.) This flexibility is what is attracting individuals and businesses to move to the cloud.
The world of the cloud has lots of participants:
•The end user who doesn’t have to know anything about the underlying technology.
•Business management who needs to take responsibility for the governance of data or services living in a cloud. Cloud service providers must provide a predictable and guaranteed service level and security to all their constituents. (Find out what providers have to consider in Cloud Computing Issues.)
•The cloud service provider who is responsible for IT assets and maintenance.
Cloud computing is offered in different forms: public clouds, private clouds, and hybrid clouds, which combine both public and private. (You can get a sense of the differences among these kinds of clouds in Deploying Public, Private, or Hybrid Clouds.)
Cloud computing can completely change the way companies use technology to service customers, partners, and suppliers…. http://www.dummies.com/how-to/content/what-is-cloud-computing.html

Moi wrote about cloud privacy concerns in Does ‘cloud storage’ affect student privacy rights?

Mike Bock wrote the intriguing Education Week article, Districts Move to the Cloud to Power Up, Save Money:

There are serious questions and concerns, however, about moving computer operations to the cloud. Chief among those worries is the security of sensitive data, such as student records. That concern alone has led some district information-technology leaders to remain hesitant about moving in that direction….
Bandwidth Needs Grow
But for districts with the bandwidth infrastructure in place, experts say cloud approaches offer lower costs and less time spent on maintenance. Since many cloud-based applications are offered either for free or for a monthly subscription rate, upfront costs for software are typically lower than the standard model of purchasing software and installing it across the district….
Privacy Concerns
But there is a trade-off. If a district puts its student-information system in a cloud environment, the cloud provider has access to information about all students.
Districts need to be protective and aware of that reality and must follow requirements outlined in state and federal policy, including the Children’s Online Privacy Protection Act, a federal law that requires that websites obtain parents’ consent before collecting personal details about users, such as home addresses or email addresses, from children younger than 13…. http://www.edweek.org/dd/articles/2013/02/06/02cloud.h06.html?tkn=PYMF4hhA6EcyMvzcq4T6AaBDFNeT6fynaPVn&cmp=clp-edweek&intc=es
School districts have to balance the rights of students to an education with the need to know of other parties. https://drwilda.com/2013/02/19/does-cloud-storage-affect-student-privacy-rights/
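The COPPA requirement described in the quote above can be sketched as a simple gate: personal details may be collected from a user under 13 only when verifiable parental consent is on record. The age threshold comes from COPPA itself; the function and its parameters are illustrative assumptions:

```python
# Minimal sketch of a COPPA-style consent gate. The age-13 threshold comes
# from the Children's Online Privacy Protection Act; everything else here
# is an illustrative assumption, not a compliance implementation.

COPPA_AGE = 13

def may_collect_personal_info(age, has_parental_consent):
    """Users 13 and over may proceed; younger users need parental consent first."""
    return age >= COPPA_AGE or has_parental_consent

print(may_collect_personal_info(12, False))  # False: consent required first
print(may_collect_personal_info(12, True))   # True
```

An actual COPPA program also covers how consent is verified, what counts as personal information, and data retention, so a check like this would be only the first of many controls.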

Kalyani M. posted Privacy Issues For Schools Using The Cloud at Spideroak blog:

While the use of cloud services helps schools to save thousands of dollars, the data security and privacy risks presented by these services cannot be ignored. The survey report by SafeGov.org says “there are a number of areas where advertising-oriented cloud services may jeopardize the privacy of data subjects in schools, even when ad-serving is nominally disabled. Threats to student online privacy occasioned by the use of such services in the school environment include the following:
•Lack of privacy policies suitable for schools: By failing to adopt privacy policies specifically crafted to the needs of schools, cloud providers may deliberately or inadvertently force schools to accept policies or terms of service that authorise user profiling and online behavioural advertising.
•Blurred mechanisms for user consent: Some cloud privacy policies, even though based on contractual relationships between cloud providers and schools, stipulate that individual data subjects (students) are also bound by these policies, even when these subjects have not had the opportunity to grant or withhold their consent.
• Potential for commercial data mining: When school cloud services derive from ad-supported consumer services that rely on powerful user profiling and tracking algorithms, it may be technically difficult for the cloud provider to turn off these functions even when ads are not being served.
•User interfaces that don’t separate ad-free and ad-based services: By failing to create interfaces that distinguish clearly between ad-based and ad-free services, cloud providers may lure school children into moving unwittingly from ad-free services intended for school use (such as email or online collaboration) to consumer ad-driven services that engage in highly intrusive processing of personal information (such as online video, social networking or even basic search).
•Contracts that don’t guarantee ad-free services: By using ambiguously worded contracts and including the option to serve ads in their services, some cloud providers leave the door open to future imposition of online advertising as a condition for allowing schools to continue receiving cloud services for free.”
SafeGov has also sought support from European Data Protection Authorities to implement rules for both cloud service providers and schools. As per these rules or codes of conduct, targeted advertising in schools and the processing or secondary use of data for advertising purposes should be banned. In the privacy policy agreement contract between the schools and service providers, it should be clearly stated that student data would not be used for data mining and advertisement purposes.
Keeping all these things in mind, schools should make sure they understand how data would be stored and managed by the service providers before moving to cloud services. They should demand assurance from the service providers that the information collected will not be used for data mining or targeted advertising, or sold to third parties… https://spideroak.com/privacypost/cloud-security/privacy-issues-when-schools-use-cloud-services/

See, Testing the Waters of Cloud Computing http://www.scholastic.com/browse/article.jsp?id=3753288

Sean Cavanaugh reported in the Education Week article, Districts’ Use of Cloud Computing Brings Privacy Risks, Study Says:

School districts have become increasingly reliant on cloud-based technologies despite “substantial deficiencies” in policies governing those Web-based systems and their protection of private student data, a new study finds.
The study, released today by the Fordham Law School’s Center on Law and Information Policy, seeks to provide the first national examination of privacy and cloud computing in public schools. The study authors also put forward a series of recommendations to policymakers for ramping up safeguards on students’ private information.
Fordham researchers based their study on a national sample of public school districts, asking for detailed information from 54 urban, suburban, and rural systems around the country.
Among the information they sought: contracts between districts and technology vendors; policies governing privacy and computer use; and notices sent to parents about student privacy and districts’ use of free or paid, third-party consulting services.
The study concludes that privacy implications for districts’ use of cloud services are “poorly understood, non-transparent, and weakly governed.”
Only 25 percent of the districts examined made parents aware of the use of cloud services, according to the study. Twenty percent do not have policies governing the use of those services, and a large plurality of districts have “rampant gaps” in their documentation of privacy policies in contracts and other forms.
To make matters worse, districts often relinquish control of student information when using cloud services, and do not have contracts or agreements setting clear limits on the disclosure, sale, and marketing of that data, the Fordham researchers say.
The Fordham study concludes that districts, policymakers, and vendors should consider taking a number of steps to increase privacy protections, including:
• Providing parents with sufficient notice of the transfer of student information to cloud-service providers, and assuring that parental consent is sought when required by federal law;
• Improving contracts between private vendors and districts to remove ambiguity and provide much more specific information on the disclosure and marketing of student data;
• Setting clearer policies on data governance within districts, which includes establishing rules barring employees from using cloud services not approved by districts. States and large districts should also hire “chief privacy officers” responsible for maintaining data protections;
• Establishing a national research center and clearinghouse to study privacy issues, and draft and store model contracts on privacy issues. The center should be “independent of commercial interests to assure objectivity,” the study authors said.
“School districts throughout the country are embracing the use of cloud computing services for important educational goals, but have not kept pace with appropriate safeguards for the personal data of school children,” said Joel Reidenberg, a professor at Fordham’s law school who worked on the study, in a statement accompanying its release. “There are critical actions that school districts and vendors must take to address the serious deficiencies in privacy protection….” http://blogs.edweek.org/edweek/DigitalEducation/2013/12/fewer.html?intc=es

Citation:

Center on Law and Information Policy
Privacy and Cloud Computing in Public Schools
Joel R. Reidenberg, Fordham University School of Law
N. Cameron Russell, Fordham University School of Law
Jordan Kovnot, Fordham University School of Law
Thomas B. Norton, Fordham University School of Law
Ryan Cloutier, Fordham University School of Law
Daniela Alvarado, Fordham University School of Law
Download Full Text (760 KB)
http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1001&context=clip
Description
Today, data driven decision-making is at the center of educational policy debates in the United States. School districts are increasingly turning to rapidly evolving technologies and cloud computing to satisfy their educational objectives and take advantage of new opportunities for cost savings, flexibility, and always-available service among others. As public schools in the United States rapidly adopt cloud-computing services, and consequently transfer increasing quantities of student information to third-party providers, privacy issues become more salient and contentious. The protection of student privacy in the context of cloud computing is generally unknown both to the public and to policy-makers. This study thus focuses on K-12 public education and examines how school districts address privacy when they transfer student information to cloud computing service providers. The goals of the study are threefold: first, to provide a national picture of cloud computing in public schools; second, to assess how public schools address their statutory obligations as well as generally accepted privacy principles in their cloud service agreements; and, third, to make recommendations based on the findings to improve the protection of student privacy in the context of cloud computing. Fordham CLIP selected a national sample of school districts including large, medium and small school systems from every geographic region of the country. Using state open public record laws, Fordham CLIP requested from each selected district all of the district’s cloud service agreements, notices to parents, and computer use policies for teachers. All of the materials were then coded against a checklist of legal obligations and privacy norms. The purpose for this coding was to enable a general assessment and was not designed to provide a compliance audit of any school district nor of any particular vendor.
Publication Date
12-13-2013
Rights
© 2013. Fordham Center on Law and Information Policy. This study may be reproduced, in whole or in part, for educational and non-commercial purposes provided that attribution to Fordham CLIP is included.
Publisher
Fordham Center on Law and Information Policy
City
New York
Keywords
children, education, cloud computing, school, FERPA, PPRA, COPPA, privacy, Joel Reidenberg, Cameron Russell, Fordham, CLIP
Privacy and Cloud Computing in Public Schools
Included in Communications Law Commons

There is a complex intertwining of laws that often prevents school officials from disclosing much about students.

According to Fact Sheet 29: Privacy in Education: Guide for Parents and Adult-Age Students, Revised September 2010, the major laws governing disclosure about student records are:

What are the major federal laws that govern the privacy of education records?
◦Family Educational Rights and Privacy Act (FERPA) 20 USC 1232g (1974)
◦Protection of Pupil’s Rights Amendments (PPRA) 20 USC 1232h (1978)
◦No Child Left Behind Act of 2001, Pub. L. 107-110, 115 STAT. 1425 (January 2002)
◦USA Patriot Act, P.L. 107-56 (October 26, 2001)
◦Privacy Act of 1974, 5 USC Part I, Ch. 5, Subch. 11, Sec. 552
◦Campus Sex Crimes Prevention Act (Pub. L. 106-386)
FERPA is the best known and most influential of the laws governing student privacy. Oversight and enforcement of FERPA rests with the U.S. Department of Education. FERPA has recently undergone some changes since the enactment of the No Child Left Behind Act and the USA Patriot Act…. https://www.privacyrights.org/fs/fs29-education.htm

The Fordham study indicates that many schools and districts have not fully analyzed student privacy concerns in their rush to the cloud.

Resources:
What cloud computing really means http://www.infoworld.com/d/cloud-computing/what-cloud-computing-really-means-031

What Is Cloud Computing? http://www.pcmag.com/article2/0,2817,2372163,00.asp

FERPA General Guidance for Students http://ed.gov/policy/gen/guid/fpco/ferpa/students.html

No Child Left Behind A Parents Guide http://ed.gov/parents/academic/involve/nclbguide/parentsguide.pdf

Related:
Data mining in education https://drwilda.com/2012/07/19/data-mining-in-education/

Who has access to student records? https://drwilda.com/2012/06/11/who-has-access-to-student-records/


Does ‘cloud storage’ affect student privacy rights?

19 Feb

Moi wrote about student privacy in Who has access to student records?

Moi discussed the Family Educational Rights and Privacy Act (FERPA) in The Federal Educational Rights and Privacy Act balancing act:

Schools all over the country are challenged by students who are violent, disruptive, and sometimes dangerous. Christine Clarridge, Seattle Times staff reporter, reported in the Seattle Times article, Student-privacy laws complicate schools’ ability to prevent attacks, which was about an unprovoked assault in a high school restroom that almost killed two students.

Five months before she allegedly attacked two schoolmates with a knife, nearly killing one, a Snohomish High School student underwent counseling after she threatened to kill another student’s boyfriend.

The 15-year-old Snohomish girl was allowed to return to school only after she presented proof she had attended counseling.

The earlier threats would have never been made public if the information wasn’t contained in court documents charging the girl with first-degree attempted murder and first-degree assault in last Monday’s attack.

Some Snohomish parents were surprised to learn of the earlier threat and have expressed concern that they weren’t notified.

But student information, including mental-health records, is tightly held by school districts because of federal privacy laws. The district says it cannot even discuss whether counselors or teachers were made aware of the earlier threats because of privacy laws.

The case underscores the delicate and complicated balancing act faced by schools in their efforts to meet the educational and privacy rights of individual students, as well as their need to ensure the safety of the larger student body. http://seattletimes.nwsource.com/html/localnews/2016643796_schoolsafety30m.html

There is a complex intertwining of laws that often prevents school officials from disclosing much about students.

According to Fact Sheet 29: Privacy in Education: Guide for Parents and Adult-Age Students, Revised September 2010, the major laws governing disclosure about student records are:

What are the major federal laws that govern the privacy of education records?

  • Family Educational Rights and Privacy Act (FERPA) 20 USC 1232g (1974)

  • Protection of Pupil’s Rights Amendments (PPRA) 20 USC 1232h (1978)

  • No Child Left Behind Act of 2001, Pub. L. 107-110, 115 STAT. 1425 (January 2002)

  • USA Patriot Act, P.L. 107-56 (October 26, 2001)

  • Privacy Act of 1974, 5 USC Part I, Ch. 5, Subch. 11, Sec. 552

  • Campus Sex Crimes Prevention Act (Pub. L. 106-386)

FERPA is the best known and most influential of the laws governing student privacy. Oversight and enforcement of FERPA rests with the U.S. Department of Education. FERPA has recently undergone some changes since the enactment of the No Child Left Behind Act and the USA Patriot Act…. https://www.privacyrights.org/fs/fs29-education.htm

https://drwilda.wordpress.com/2011/10/30/the-federal-educational-rights-and-privacy-act-balancing-act/

Still, schools collect a lot of information about students.

Mike Bock wrote the intriguing Education Week article, Districts Move to the Cloud to Power Up, Save Money:

There are serious questions and concerns, however, about moving computer operations to the cloud. Chief among those worries is the security of sensitive data, such as student records. That concern alone has led some district information-technology leaders to remain hesitant about moving in that direction….

Bandwidth Needs Grow

But for districts with the bandwidth infrastructure in place, experts say cloud approaches offer lower costs and less time spent on maintenance. Since many cloud-based applications are offered either for free or for a monthly subscription rate, upfront costs for software are typically lower than the standard model of purchasing software and installing it across the district….

Privacy Concerns

But there is a trade-off. If a district puts its student-information system in a cloud environment, the cloud provider has access to information about all students.

Districts need to be protective and aware of that reality and must follow requirements outlined in state and federal policy, including the Children’s Online Privacy Protection Act, a federal law that requires that websites obtain parents’ consent before collecting personal details about users, such as home addresses or email addresses, from children younger than 13.

“You want to make sure you understand the company you’re dealing with and look into how they deal with privacy concerns,” says Atkinson-Shorey.

Paul Potter, the director of technological infrastructure for the 3,150-student Tomah, Wis., school system, says districts that have staff members with computer-programming backgrounds might want to consider developing their own cloud applications if they find that their needs aren’t being met by some of the more popular cloud-computing providers….

http://www.edweek.org/dd/articles/2013/02/06/02cloud.h06.html?tkn=PYMF4hhA6EcyMvzcq4T6AaBDFNeT6fynaPVn&cmp=clp-edweek&intc=es

School districts have to balance the rights of students to an education with the need to know of other parties.

Resources:

FERPA General Guidance for Students

http://ed.gov/policy/gen/guid/fpco/ferpa/students.html

No Child Left Behind A Parents Guide

http://ed.gov/parents/academic/involve/nclbguide/parentsguide.pdf

Related:

Data mining in education https://drwilda.com/2012/07/19/data-mining-in-education/

Who has access to student records? https://drwilda.com/2012/06/11/who-has-access-to-student-records/

Where information leads to Hope. © Dr. Wilda.com

Dr. Wilda says this about that ©

Blogs by Dr. Wilda:

COMMENTS FROM AN OLD FART © http://drwildaoldfart.wordpress.com/

Dr. Wilda Reviews © http://drwildareviews.wordpress.com/

Dr. Wilda © https://drwilda.com/

‘Big Brother’ and the schools

24 Nov

Moi wrote about that Texas “Big Brother” in Texas digital school ID: ‘Big brother’ or the ‘mark of the beast’?

“There was of course no way of knowing whether you were being watched at any given moment. How often, or on what system, the Thought Police plugged in on any individual wire was guesswork. It was even conceivable that they watched everybody all the time. But at any rate they could plug in your wire whenever they wanted to. You had to live—did live, from habit that became instinct—in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized.”

George Orwell, 1984

Huffington Post is reporting in the article, Texas School District Reportedly Threatening Students Who Refuse Tracking ID, Can’t Vote For Homecoming:

Weeks after Northside Independent School District in San Antonio rolled out its new “smart” IDs that tracks students’ geographic locations, the community is still at odds with the program.

The “Student Locator Project,” which is slated to eventually reach 112 Texas schools and close to 100,000 students, is in trial stages in two Northside district schools. In an effort to reduce truancy, the district has issued new student IDs with an embedded radio-frequency identification (RFID) chip that tracks the location of a student at all times.

The program officially launched October 1 at John Jay High School and Anson Jones Middle School. Without the badges — required to be worn around the neck — students cannot access common areas like the cafeteria or library, and cannot purchase tickets to extracurricular activities. WND reports that the district has threatened to suspend, fine or involuntarily transfer students who fail to comply and officials have noted that “there will be consequences for refusal to wear an ID card as we begin to move forward with full implementation.”

Parents and students from the schools spoke out against the project last month. But now, WND is reporting that schools are taking the restrictions one step further.

John Jay High School sophomore Andrea Hernandez refuses to use the new IDs, citing religious beliefs and instead sticking with her old badge from previous years, calling the tracking devices the “mark of the beast.” She tells Salon that the new badges make her uncomfortable and are an invasion of her privacy.

But to add to her restricted school grounds access, the teen says she was barred from voting for homecoming king and queen.

“I had a teacher tell me I would not be allowed to vote because I did not have the proper voter ID,” she told WND. “I had my old student ID card which they originally told us would be good for the entire four years we were in school. He said I needed the new ID with the chip in order to vote.”

If successful, the tracking program could save the district as much as $175,000 lost daily to low attendance figures, which in part determine school funding. Higher attendance could lead to more state funding in the neighborhood of $1.7 million. A statement on the school district’s website lays out the program’s goals: to increase student safety and security, increase attendance and offer a multi-purpose “smart” student ID card that streamlines grounds access and purchasing power.

While uncommon, RFID chips are not new to school IDs, according to Wired. Schools in Houston launched a monitoring program as early as 2004, and a federally funded preschool in California started placing RFID chips in children’s clothes two years ago. Numerous districts have also considered similar programs, but without making them mandatory. http://www.huffingtonpost.com/2012/10/08/texas-school-district-rep_n_1949415.html?utm_hp_ref=email_share

What, one might ask, would cause a school district to do that ‘big brother’ thing? It’s the money, stupid. According to Maurice Chammah and Nick Swartsell of the Texas Tribune, writing in the article Student IDs That Track the Students, which was published in the New York Times:

In Texas, school finance is a numbers game: schools receive money based on the number of students counted in their homeroom classes each morning. At Anson Jones, as at other schools, many students were in school but not in homeroom, so they were not counted and the district lost money, said Pascual Gonzalez, a spokesman for the district.

“We were leaving money on the table,” he said, adding that the district expects a $2 million return on an initial investment of $261,000 in the technology at two pilot schools. [Emphasis Added] http://www.nytimes.com/2012/10/07/us/in-texas-schools-use-ids-to-track-students.html?emc=eta1&_r=0 http://drwildaoldfart.wordpress.com/2012/10/09/texas-digital-school-id-big-brother-or-the-mark-of-the-beast/

Students who refuse to be monitored are being expelled.

Aaron Dykes writes at Infowars.Com in the article, Student Expelled for Refusing Location Tracking RFID Badge:

After months of protesting a policy requiring high school students to wear an RFID-enabled ID badge around their necks at all times, Andrea Hernandez is being involuntarily withdrawn from John Jay High School in San Antonio effective November 26th, according to a letter sent by the district that has now been made public.

The letter, sent on November 13, informs her father that the Smart ID program, which was phased in with the new school year, is now in “full implementation” and requires all students to comply by wearing the location-tracking badges.

Since Andrea Hernandez has refused to wear the badge, she is being withdrawn from the magnet school and her program at the Science and Engineering Academy, and instead will have to attend William Howard Taft HS, which is not currently involved in the ID scheme, unless she changes her position.

Civil liberties lawyers at the Rutherford Institute told Infowars.com that they are in the process of filing a temporary restraining order petition to prevent the school from kicking Hernandez out until further appeals can be made to resolve the matter. Representatives for John Jay did not return calls for comment by the time of publishing.

Andrea, backed by her family, has claimed the policy violates her religious beliefs and unduly infringes on her privacy. The controversial ID badge includes the photo and name of each student, a barcode tied to the student’s social security number, as well as an RFID chip which pinpoints the exact location of the individual student, including after hours and when the student leaves campus.

The battle over the IDs has been an ongoing saga. The Hernandez family has previously attended several school board meetings, organized protests and filed formal grievances with the district over the matter, and has been backed by numerous civil rights advocates.

Infowars reporters covered a protest that took place in early October, following up with appearances by the Hernandez family on the Alex Jones Show and the Infowars Nightly News programs.

Letter from John Jay High School withdrawing Andrea Hernandez for not submitting to the RFID tracking ID badges.


http://www.infowars.com/student-expelled-for-refusing-location-tracking-rfid-badge/

For money, you would sell your soul.
 Sophocles

Resources:

Big Brother invades our classrooms http://www.salon.com/2012/10/08/big_brother_invades_our_classrooms/

ACLU documents show increasing phone and internet surveillance by Department of Justice http://www.theverge.com/2012/9/27/3418420/department-of-justice-surveillance-increase-aclu

Related:

Who has access to student records? https://drwilda.com/tag/student-privacy-laws-complicate-schools-ability-to-prevent-attacks/

Dr. Wilda says this about that ©

Blogs by Dr. Wilda:

COMMENTS FROM AN OLD FART © http://drwildaoldfart.wordpress.com/

Dr. Wilda Reviews © http://drwildareviews.wordpress.com/

Dr. Wilda © https://drwilda.com/

Data mining in education

19 Jul

Marc Parry has written a fascinating Chronicle of Higher Education article which was also published in the New York Times, Big Data on Campus:

With 72,000 students, A.S.U. is both the country’s largest public university and a hotbed of data-driven experiments. One core effort is a degree-monitoring system that keeps tabs on how students are doing in their majors. Stray off-course and a student may have to switch fields.

And while not exactly matchmaking, Arizona State takes an interest in students’ social lives, too. Its Facebook app mines profiles to suggest friends. One classmate shares eight things in common with Ms. Allisone, who “likes” education, photography and tattoos. Researchers are even trying to figure out social ties based on anonymized data culled from swipes of ID cards around the Tempe campus.

This is college life, quantified.

Data mining hinges on one reality about life on the Web: what you do there leaves behind a trail of digital breadcrumbs. Companies scoop those up to tailor services, like the matchmaking of eHarmony or the book recommendations of Amazon. Now colleges, eager to get students out the door more efficiently, are awakening to the opportunities of so-called Big Data.

The new breed of software can predict how well students will do before they even set foot in the classroom. It recommends courses, Netflix-style, based on students’ academic records.

Data diggers hope to improve an education system in which professors often fly blind. That’s a particular problem in introductory-level courses, says Carol A. Twigg, president of the National Center for Academic Transformation. “The typical class, the professor rattles on in front of the class,” she says. “They give a midterm exam. Half the kids fail. Half the kids drop out. And they have no idea what’s going on with their students.”

As more of this technology comes online, it raises new tensions. What role does a professor play when an algorithm recommends the next lesson? If colleges can predict failure, should they steer students away from challenges? When paths are so tailored, do campuses cease to be places of exploration? http://www.nytimes.com/2012/07/22/education/edlife/colleges-awakening-to-the-opportunities-of-data-mining.html?emc=eta1
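The “Netflix-style” course recommendation the article describes is usually built on collaborative filtering: students whose grade histories resemble yours are assumed to predict how you will fare in courses you have not yet taken. A minimal sketch in Python — the student names, course names, and grades below are all invented for illustration, not taken from any real system:

```python
from math import sqrt

# Toy grade records on a 0.0-4.0 scale; all values are hypothetical.
grades = {
    "alice": {"calc1": 3.7, "stats": 3.3, "bio": 2.0},
    "bob":   {"calc1": 3.5, "stats": 3.0, "chem": 3.8},
    "carol": {"calc1": 2.0, "bio": 3.9, "chem": 2.2},
}

def cosine(a, b):
    """Cosine similarity over the courses two students share."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[c] * b[c] for c in shared)
    norm_a = sqrt(sum(a[c] ** 2 for c in shared))
    norm_b = sqrt(sum(b[c] ** 2 for c in shared))
    return dot / (norm_a * norm_b)

def predict(student, course):
    """Similarity-weighted average of other students' grades in `course`."""
    others = [(cosine(grades[student], g), g[course])
              for name, g in grades.items()
              if name != student and course in g]
    weight = sum(sim for sim, _ in others)
    if weight == 0:
        return None
    return sum(sim * grade for sim, grade in others) / weight

# Predict how alice might do in chem, a course she has not taken.
print(round(predict("alice", "chem"), 2))
```

Real campus systems add far more signal (credit hours, prerequisites, demographics), which is exactly why the privacy questions in the rest of this post arise: the prediction only works because the institution retains detailed records on everyone.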

For a really good explanation of “data mining” go to Alexander Furnas’ Atlantic article.

In Everything You Wanted to Know About Data Mining but Were Afraid to Ask, Furnas writes:

To most of us data mining goes something like this: tons of data is collected, then quant wizards work their arcane magic, and then they know all of this amazing stuff. But, how? And what types of things can they know? Here is the truth: despite the fact that the specific technical functioning of data mining algorithms is quite complex — they are a black box unless you are a professional statistician or computer scientist — the uses and capabilities of these approaches are, in fact, quite comprehensible and intuitive.

For the most part, data mining tells us about very large and complex data sets, the kinds of information that would be readily apparent about small and simple things. For example, it can tell us that “one of these things is not like the other” a la Sesame Street or it can show us categories and then sort things into pre-determined categories. But what’s simple with 5 datapoints is not so simple with 5 billion datapoints….

Discovering information from data takes two major forms: description and prediction. At the scale we are talking about, it is hard to know what the data shows. Data mining is used to simplify and summarize the data in a manner that we can understand, and then allow us to infer things about specific cases based on the patterns we have observed. Of course, specific applications of data mining methods are limited by the data and computing power available, and are tailored for specific needs and goals. However, there are several main types of pattern detection that are commonly used. These general forms illustrate what data mining can do. http://www.theatlantic.com/technology/archive/2012/04/everything-you-wanted-to-know-about-data-mining-but-were-afraid-to-ask/255388/

With the ability to collect vast amounts of information comes the question of what about privacy and what is the definition of privacy?

Ljiljana Brankovic and Vladimir Estivill-Castro discuss privacy issues in Privacy Issues in Knowledge Discovery:

Recent developments in information technology have enabled collection and processing of vast amounts of personal data, such as criminal records, shopping habits, credit and medical history, and driving records. This information is undoubtedly very useful in many areas, including medical research, law enforcement and national security. However, there is an increasing public concern about the individuals’ privacy. Privacy is commonly seen as the right of individuals to control information about themselves. The appearance of technology for Knowledge Discovery and Data Mining (KDDM) has revitalized concern about the following general privacy issues:

· secondary use of the personal information,

· handling misinformation, and

· granulated access to personal information.

They demonstrate that existing privacy laws and policies are well behind the developments in technology, and no longer offer adequate protection.

We also discuss new privacy threats posed by KDDM, which includes massive data collection, data warehouses, statistical analysis and deductive learning techniques. KDDM uses vast amounts of data to generate hypotheses and discover general patterns. KDDM poses the following new challenges to privacy:

· stereotypes,

· guarding personal data from KDDM researchers,

· individuals from training sets, and

· combination of patterns.

http://www.ict.griffith.edu.au/~vlad/teaching/kdd.d/readings.d/aice99.pdf
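One widely used defense against the threats Brankovic and Estivill-Castro list — particularly re-identifying individuals from released data — is k-anonymity: before publication, quasi-identifiers (ZIP code, age, and so on) are generalized until every combination of them is shared by at least k records. A minimal check in Python; the student records below are invented for illustration:

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True if every combination of quasi-identifier values
    appears in at least k of the records."""
    combos = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in combos.values())

# Hypothetical records: ZIP generalised to 3 digits, age to a band.
records = [
    {"zip": "981**", "age": "14-15", "condition": "asthma"},
    {"zip": "981**", "age": "14-15", "condition": "none"},
    {"zip": "980**", "age": "16-17", "condition": "ADHD"},
    {"zip": "980**", "age": "16-17", "condition": "none"},
]

print(is_k_anonymous(records, ["zip", "age"], k=2))  # each combo appears twice
print(is_k_anonymous(records, ["zip", "age"], k=3))  # fails at k=3
```

Even k-anonymity is only a partial answer — the “combination of patterns” challenge above refers to exactly the kind of cross-dataset linkage it cannot prevent on its own.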

In Who has access to student records? Moi said:

There is a complex intertwining of laws which often prevent school officials from disclosing much about students.

According to Fact Sheet 29: Privacy in Education: Guide for Parents and Adult-Age Students, revised September 2010, the major laws governing disclosure about student records are:

What are the major federal laws that govern the privacy of education records?

  • Family Educational Rights and Privacy Act (FERPA) 20 USC 1232g (1974)

  • Protection of Pupil’s Rights Amendments (PPRA) 20 USC 1232h (1978)

  • No Child Left Behind Act of 2001, Pub. L. 107-110, 115 STAT. 1425 (January 2002)

  • USA Patriot Act, P.L. 107-56 (October 26, 2001)

  • Privacy Act of 1974, 5 USC Part I, Ch. 5, Subch. 11, Sec. 552

  • Campus Sex Crimes Prevention Act (Pub. L. 106-386)

FERPA is the best known and most influential of the laws governing student privacy. Oversight and enforcement of FERPA rests with the U.S. Department of Education. FERPA has recently undergone some changes since the enactment of the No Child Left Behind Act and the USA Patriot Act…. https://www.privacyrights.org/fs/fs29-education.htm

https://drwilda.wordpress.com/2012/06/11/who-has-access-to-student-records/

See, The Federal Educational Rights and Privacy Act balancing act https://drwilda.wordpress.com/2011/10/30/the-federal-educational-rights-and-privacy-act-balancing-act/

Still, schools collect a lot of information about students.

Jason Koebler has written an interesting U.S. News article, Who Should Have Access to Student Records?

Since “No Child Left Behind” was passed 10 years ago, states have been required to ramp up the amount of data they collect about individual students, teachers, and schools. Personal information, including test scores, economic status, grades, and even disciplinary problems and student pregnancies, are tracked and stored in a kind of virtual “permanent record” for each student.

But parents and students have very little access to that data, according to a report released Wednesday by the Data Quality Campaign, an organization that advocates for expanded data use.

All 50 states and Washington, D.C., collect long-term, individualized data on students’ performance, but just eight states allow parents to access their child’s permanent record. Forty allow principals to access the data and 28 provide student-level info to teachers.

Education experts, including Secretary of Education Arne Duncan and former Washington, D.C., Schools Chancellor Michelle Rhee, argue that education officials can use student data to assess teachers—if many students’ test scores are jumping in a specific teacher’s class, odds are that teacher is doing a good job.

Likewise, teachers can use the data to see where a student may have struggled in the past and can tailor instruction to suit his needs.

At an event discussing the Data Quality Campaign report Wednesday, Rhee said students also used the information to try to out-achieve each other.

“The data can be an absolute game changer,” she says. “If you have the data, and you can invest and engage children and their families in this data, it can change a culture quickly.”

Privacy experts say the problem is that states collect far more information than parents expect, and it can be shared with more than just a student’s teacher or principal.

“When you have a system that’s secret [from parents] and you can put whatever you want into it, you can have things going in that’ll be very damaging,” says Lillie Coney, associate director of the Electronic Privacy Information Center. “When you put something into digital form, you can’t control where that’ll end up.”

According to a 2009 report by the Fordham University Center on Law and Information Policy, some states store students’ social security numbers, family financial information, and student pregnancy data. Nearly half of states track students’ mental health issues, illnesses, and jail sentences.

Without access to their child’s data, parents have no way of knowing what teachers and others are learning about them.

http://www.usnews.com/news/articles/2012/01/19/who-should-have-access-to-student-records

The U.S. Department of Education enforces FERPA.

Resources:

Data Mining: How Companies Now Know Everything About You http://www.time.com/time/magazine/article/0,9171,2058205,00.html#ixzz2172ZKahA

What is Data Mining? – YouTube http://www.youtube.com/watch?v=R-sGvh6tI04

Defining Privacy for Data Mining http://cimic.rutgers.edu/~jsvaidya/pub-papers/ngdm-privacy.pdf

Dr. Wilda says this about that ©