NEW BRUNSWICK, NJ—Verificient Technologies, the company behind the student-monitoring, anti-cheating software ProctorTrack, has not communicated to Rutgers students what the company has done with their personal data. 

As we reported, ProctorTrack uses remote-monitoring technology to collect audio and video and to document the web activity of students as they take an exam. The software also scans the student's ID, face, and knuckles, and takes a voice sample.

The Rutgers-Verificient contract, which was signed by Rutgers on August 25 and released to New Brunswick Today the same day, details the responsibilities the company has for the storage and disposal of student data. 

A section on page four of the contract, titled “student data purge and retention,” says that a student’s personal data would be held for 90 days after either the last day of class or the date the final exam is taken.

At that point, the student would receive an “automatic generated” email from the company assuring them that the student data collected by ProctorTrack would be “purged and deleted.”

Any backup data stored on the servers would be deleted within 30 additional days, memorialized by a second email sent to the student within 15 days indicating a “complete deletion of student records.”

But complaints from students suggest that Verificient has not sent out any notification about the status of their data.

Several New Brunswick Today readers contacted this reporter to say they had not received any email about whether their data was purged, deleted, stored or distributed.

A joint statement released by Rutgers and Verificient in February outlined the company’s promise that the ProctorTrack servers would delete any student data within 30-60 days of the final exam of the course.

The company also posted a privacy pledge on its website, reiterating its promise that student data would be purged and deleted within 30-60 days of the final exam.

But the company’s privacy policy at the time of that post stated that it could unilaterally amend its policies at any time, and that student data could be disclosed to third-party service providers or in the event of a bankruptcy or company merger.

But the contract signed in August provided a longer time frame to delete data and notify students: 90 days.

According to the contract, which actually went into effect seven months before it was signed, students who used the software during the spring 2015 semester should have received email notifications that their proctoring data had been permanently deleted from the servers.

Officials at Rutgers and Verificient have not responded to inquiries about whether or not the company is in compliance.

ProctorTrack was hastily introduced to Rutgers students at the beginning of the spring 2015 semester. The technology had only received a patent from the federal government on January 6, just weeks before the software was rolled out to thousands of students.

According to Rutgers officials, a “verbal agreement” was in place between Rutgers and Verificient.

During the seven months under the verbal agreement, confusion and miscommunication arose over the price of the software and its slipshod introduction to students, as well as how long the private company would retain students’ personal data and whether students were required to use it or could opt for alternative proctoring methods.

Students were also frustrated by the $32 fee to use the software, and by the perceived lack of notification, which resulted in many students finding out about the new software after it was too late to drop their courses.

Rutgers student Betsy Chao launched a petition in February calling for an end to Rutgers’ use of the software. 

University officials, along with Verificient CEO Tim Dutta, maintained that the introduction of the software at the beginning of the spring 2015 semester went flawlessly.

Rutgers officials also defended the rollout, and the notifications made to students about the changes to online courses.

“The ProctorTrack notice appeared from the first day of class in the Course Home and the first assignment on the first day of class was to read these documents,” Rutgers spokesperson EJ Miranda wrote in February.  “The notice included the pricing.”

“All students had the regular drop/add period plus the two-day extension to drop the course if they wished to do so.”

But months later, University President Bob Barchi admitted in an interview with the Daily Targum that Rutgers might not have done enough to make sure students were properly notified about the software. 

“I think what maybe wasn’t done quite right here was to make sure that everybody who is taking that course knew that was coming,” Barchi told the paper.

A New York Times article published in April shed light on a new angle of the controversy.

Jeffrey Alan Johnson, assistant director of institutional effectiveness and planning at Utah Valley University, compared ProctorTrack to the controversial Transportation Security Administration (TSA) passenger screening program known as SPOT (Screening of Passengers by Observation Techniques).

Indeed, the creator of ProctorTrack touted his experience working with the TSA to develop technology similar to ProctorTrack.

The SPOT program employs specially trained TSA officials, known as “behavior detection officers,” to observe and interact with crowds as they pass through security checkpoints, seeking out individuals displaying suspicious behaviors that could mark them as potential terrorists.

Some of these behaviors include exaggerated yawning, looking down or avoiding eye contact with security personnel, sweaty palms, improper attire, and appearing confused or disoriented.

Reporter at New Brunswick Today

Award-winning multimedia journalist with experience in digital-first and print media. Daniel has covered local, state, and regional issues, using photography, social media, and in-depth reporting to produce high-quality work.