The Case for Right to Repair Has Not Been Made

Chris Mohr, President, SIIA; Paul Lekas, SVP, Global Public Policy & Government Affairs, SIIA

Today, the House Judiciary Committee’s IP Subcommittee held a hearing on “right to repair” – the concept that purchasers of vehicles, electronic devices, and other products should be able to repair and modify those products without voiding manufacturer warranties, and in some cases that manufacturers should be required to make repair data and tools available to consumers and repair shops.

Our members are in the business of information, and much of that information is sold as a service through many different devices and different ecosystems. Those ecosystems are protected by intellectual property statutes that prevent unauthorized and harmful uses of those platforms. For example, section 1201 of the Digital Millennium Copyright Act (DMCA) has spawned a huge investment in the distribution of software and other literary works online by preventing businesses from forming around piracy tools. As information has become increasingly sold as a service, changes in technology have changed the relationship between consumers and their devices and a digital market for all kinds of works has exploded—exactly as Congress intended.

Against the tremendous success of the DMCA in promoting the dissemination of online works, we are skeptical of the need for any additional legislation around “right to repair.” The case for it has simply not been made. We note that when the FTC examined this issue, it found that “the assertion of IP rights does not appear to be a significant impediment to independent repair.” Even in the most famous example, only 2% of tractor repairs run into an IP-based problem. And both auto and cell phone manufacturers have entered MOUs that deal with these issues. While individual anecdotes may seem compelling and have convinced some state legislatures, the policy case for risking the backbone of a massively successful ecosystem is non-existent.

Importantly, piracy risks are not the only ones implicated by the right to repair movement. When Massachusetts voters approved a ballot initiative to require automobile manufacturers to provide open remote access to vehicle “telematics” data to customers and repair shops, the National Highway Traffic Safety Administration (NHTSA) felt the safety and security concerns of the law were sufficiently significant that it contacted automotive manufacturers to urge them not to comply. As NHTSA explained, the open access required by the Massachusetts law would “allow[] for the manipulation of systems on a vehicle, including safety-critical functions such as steering, acceleration, or braking, as well as … air bags and electronic stability control.” NHTSA expressed concern that “[a] malicious actor here or abroad could utilize such open access to remotely command vehicles to operate dangerously, including attacking multiple vehicles concurrently. Vehicle crashes, injuries, or deaths are foreseeable outcomes of such a situation.”

The kinds of safety and security concerns raised by NHTSA are not limited to the automotive sector. The business of information can thrive only if its users believe that their data is being handled responsibly. Indeed, ill-designed right-to-repair legislation could have a significant impact on the safety of personal data in consumer electronic devices and undermine critical cybersecurity protections built into the software and hardware of those devices. Too many right-to-repair proposals demonstrate a lack of awareness of how opening access to manufactured products creates exposure that undermines cybersecurity and privacy.

Finally, some have homed in on right to repair as part of a larger effort around competition. As this argument goes, restrictions on third-party repair serve only to entrench manufacturers. This position ignores the essential investment that manufacturers have made to ensure the safety and security of data used in their products. Part of what makes products valuable to users is how those products protect their data and come equipped with software that mitigates the risk of a cybersecurity breach. Existing competition law is more than capable of dealing with the minority of situations in which this claimed “right” is implicated.

North Carolina Implements Stringent Data Security Standards for Third-Party Vendors Handling Student Information

For much of the past decade, states have been implementing new laws and policies to protect the privacy and safety of student data. Most recently, North Carolina’s Department of Public Instruction (NCDPI) launched new data security standards for any technology or system that receives student information from a state system. Taking effect on August 1, 2023, the statewide policy is designed to ensure that public school units (PSUs) have the resources they need to adequately evaluate the security readiness of vendor partners. In an effort to prevent cybersecurity threats in ed tech platforms and tools, NCDPI implemented a new process that affects third-party vendors at a PSU. In short, third-party vendors will be required to do the following:

  • Sign the DPI Data Confidentiality and Security Agreement, with no modifications.
  • Articulate which statewide systems they will connect to, which data fields they are requesting and the rationale for collecting them, how access to that data will be restricted to users with a legitimate business need, and any data that will be written back to a statewide system.
  • Submit security documentation, including a vendor readiness assessment report, a third-party assessment report (FedRAMP authorization, ISO 27001 certification, or similar) no more than 12 months old, and evidence of alignment with the NC DIT Statewide Information Security Manual.
  • Provide additional documentation if not in compliance with the Statewide Information Security Manual. 

Third-party vendors that are contracted or renewed after August 1, 2023 will have to be evaluated through the aforementioned steps before their products can be integrated at the PSU. Vendors that do not comply with the security requirements for integration will not be allowed to receive student data from the PSU.

SIIA raised concerns with the new policy and requested additional guidance via a letter on June 20, 2023. We received a response on July 12, 2023 with direct answers to our questions, to which SIIA responded via another letter on July 25, 2023. We are posting these answers for the broader public in case they are of assistance. Further, SIIA participated in a public meeting/call with NCDPI on June 22, 2023; however, there is no recording of that call. There is still much confusion about the new requirements, and we look forward to working with our members and the state of North Carolina to make sure student data is protected.

If you have any additional questions, please contact our education policy team at education@siia.net.


 

To view the letter and response, please click here.  

 

SIIA Urges CFPB to Protect Data Brokers and First Amendment Rights

The Software & Information Industry Association (SIIA) has expressed concerns to the Consumer Financial Protection Bureau (CFPB) regarding regulations on “data brokers.” SIIA, representing over 450 companies in the information business, cautions against regulations that hinder productive data use or infringe on First Amendment rights. They argue that data brokers play a crucial role in providing information for various purposes such as law enforcement, combating money laundering and terrorism, child support enforcement, insurance, product safety, tax compliance, fraud prevention, and news publishing. These activities rely on publicly-available data and contribute to consumer security and convenience. SIIA asserts that there is a legislative and regulatory consensus that data brokering is protected by the First Amendment. They also argue that restrictions on publicly-available data or its derivatives would violate the First Amendment. While regulations on private data may be appropriate, they emphasize the importance of protecting the public domain and maintaining the availability of information for positive societal endeavors.

‘This AI Is Something We Can Interact With Easily’; AI Snapshots From the Content World

Jim VandeHei, co-founder of Axios, recently wrote that the improved search results people get with ChatGPT-like technologies will “force publishers to tighten our direct relationship with you, the customer/consumer.” He also predicts that “newsletters will rise in importance, as Microsoft and Google make emailing magically easy by helping you write, answer and sort emails.”

Engineers are creating AI reporters that can participate in and cover meetings—though still a bit “hallucinatory”—illustrate them like Monet, and write sponsored content. But that type of transformation, according to VandeHei, should only make publishers want to be more trustworthy.

“AI will rain a hellfire of fake and doctored content on the world, starting now,” VandeHei wrote. “That’ll push readers to seek safer and trusted sources of news—directly instead of through the side door of social media.”

That’s interesting because the Reuters Digital News Report that I wrote about earlier in the week talks about all the traffic now going through those side doors.

VandeHei also predicts that “advertisers will shift to safer, well-lit spaces, creating a healthy incentive for some publishers to get rid of the litter you see on their sites today. That shift is already happening.”

“Everything we’re seeing right now is a progression over time,” said Dray McFarlane of Tasio at our recent AMPLIFY summit. “The difference is this AI is something we can interact with easily—ChatGPT gets people involved.”

Here are 6 AI snapshots from the content frontlines:

Create AI reporters. Mark Talkington, publisher of The Palm Springs Post and another Coachella Valley publication, has been developing an artificial reporter named Paul, writes Sophie Culpepper in NiemanLab. “It’s just software that works in the background and listens in and acts as a reporter,” Talkington said. “It can sit there and it can draw pictures that look like Monet based on…what people are saying, but we don’t need any of that. All we need is a short summary of what happened during this [local town] meeting.”

Summarize meetings. Talkington’s friend, former Microsoft VP Peter Loforte, also created an AI, nicknamed Maria, who can participate in and accurately summarize meetings, Culpepper reported. “Both Maria and Paul ‘are built using a collection of publicly available models and AIs that I fine-tune, chain together, and customize in novel ways leveraging Python code,’ Loforte said. He has mostly relied on the library LangChain, ‘which enables developers to build all sorts of LLM-based solutions leveraging any number of models… I’ve never seen progress move so quickly. It is the Wild West right now.’”
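The “chain models together” pattern Loforte describes can be sketched in plain Python. The two “model” functions below are stand-in stubs invented for this example; a real pipeline like his would call LLMs through a library such as LangChain.

```python
# Illustrative sketch of chaining text-processing steps; the "model"
# functions here are stubs (assumptions), not real LLM calls.
from typing import Callable

def chain(*steps: Callable[[str], str]) -> Callable[[str], str]:
    """Compose text-processing steps so each output feeds the next."""
    def run(text: str) -> str:
        for step in steps:
            text = step(text)
        return text
    return run

def transcribe(audio_ref: str) -> str:
    # Stub: a real step would run speech-to-text on the recording.
    return f"Transcript of {audio_ref}: the council approved the budget."

def summarize(transcript: str) -> str:
    # Stub: keep only the clause after the colon as a toy "summary".
    return transcript.split(": ", 1)[-1].capitalize()

meeting_reporter = chain(transcribe, summarize)
print(meeting_reporter("town-meeting.wav"))
# → The council approved the budget.
```

The appeal of the pattern is that each step can be swapped out independently, which is what makes fine-tuning and recombining models “in novel ways” practical.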

Generate messaging. ChatGPT can help generate safety messaging content to be distributed to the community or help to educate community members, wrote Rachel Engel of Lexipol. Consider how the chatbot can assist with preparations for certain events, such as daily EMS Week themes, by creating digestible information in lay terms that can be customized by age. For example: “Explain to a 5-year-old what paramedics do.”
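Engel’s age-customization idea amounts to a parameterized prompt template. The function name and exact wording below are assumptions for illustration, not Lexipol tooling:

```python
# Illustrative prompt template for age-customized safety messaging;
# the function name and phrasing are assumptions, not a real API.
def safety_prompt(role: str, age: int) -> str:
    """Build a ChatGPT-style prompt targeting a specific reading age."""
    return f"Explain to a {age}-year-old what {role} do, using simple, friendly language."

print(safety_prompt("paramedics", 5))
# → Explain to a 5-year-old what paramedics do, using simple, friendly language.
```

The same template can then be run across several ages and topics to produce a family of messages from one piece of source material.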

Send an automated newsletter. Scott Brodbeck, founder of Virginia-based media company Local News Now, began experimenting with a completely automated weekday morning newsletter comprising an AI-written introduction and summaries of human-written stories, Culpepper wrote. Using tools like Zapier, Airtable, and RSS, ARLnow can create and send the newsletter without any human intervention. Next he wants to do a daily update on YouTube and is “experimenting with using AI to look for typos and other errors in newly published articles; categorize articles into positive, neutral and negative buckets for potential social media purposes; and drive a chatbot to help clients write sponsored articles.”
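The assembly step of such a newsletter can be sketched as below. Brodbeck’s actual pipeline wires RSS, Airtable, and Zapier together and gets its intro from an AI; the `Story` fields and sample URL here are assumptions for illustration.

```python
# Hypothetical sketch of the assembly step of an automated newsletter:
# an (AI-written) intro plus bulleted summaries of human-written stories.
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    summary: str
    url: str

def build_newsletter(intro: str, stories: list[Story]) -> str:
    """Combine an intro with bulleted summaries of the day's stories."""
    lines = [intro, ""]
    lines += [f"* {s.title}: {s.summary} ({s.url})" for s in stories]
    return "\n".join(lines)

issue = build_newsletter(
    "Good morning, Arlington. Here's what you need to know today.",
    [Story("Gondola study advances", "County revisits the aerial-transit idea.",
           "https://example.com/gondola")],
)
print(issue)
```

In the real workflow, the story list would be populated from an RSS feed and the result handed to an email-sending service, with no human in the loop.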

Experiment. In a TV interview at the end of last year, Brodbeck said that they “were having fun with AI.” They had it write stories on a peppermint mocha shortage and paranormal activity at a county civic meeting—and “sing the praises of a potential Rosslyn to Georgetown gondola. It liked that. Then we asked it to talk nicely about Arlington in general and the AI came up on its own with the Air Force Memorial, Tomb of the Unknowns and the Iwo Jima Memorial.”

Work with transcripts. “You can take a transcript, or maybe even a series of minutes that someone’s typed up and reformat them, putting them into bulleted lists—adding Markdown formatting,” Joe Amditis, assistant director for products and events at the Center for Cooperative Media at Montclair State (NJ) University, told me. “If you have a set of instructions on how you’d like to format something, you can set up those instructions through Zapier. The bot will take the text—whether through a Google form, or even a Slack message with a certain tag or in a certain channel. It’ll make decisions on how to label things, and once you iterate and tweak that process and give it a good run through to make sure that it’s consistent.”
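A minimal, rule-based sketch of the transcript-to-Markdown reformatting Amditis describes might look like the following; in his setup, the formatting instructions go to an LLM through Zapier, which also handles intake from forms or Slack, so this stand-alone function is only an approximation of the idea.

```python
# Rule-based sketch of transcript-to-Markdown reformatting (an
# approximation; the real workflow sends instructions to an LLM).
def transcript_to_markdown(transcript: str, heading: str) -> str:
    """Reformat one-item-per-line transcript text into a bulleted Markdown list."""
    bullets = [f"- {line.strip()}" for line in transcript.splitlines() if line.strip()]
    return "\n".join([f"## {heading}", ""] + bullets)

notes = transcript_to_markdown(
    "Budget approved 5-2.\nPark renovation tabled until fall.",
    "Council meeting highlights",
)
print(notes)
```

The LLM version of this earns its keep when the input is messier than one item per line: it can decide how to label and group things, which is the iteration-and-tweaking Amditis mentions.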