Weekly Roundup Of Tech News – 05/23/2021
What: According to Axios, a KFF survey reported that more than half of Medicare beneficiaries used the telephone for their telehealth visits.
How: More than 56% of beneficiaries used the telephone for telehealth. The share was even higher among Hispanic beneficiaries (61%) and rural beneficiaries (65%), compared with only 28% of beneficiaries who used video.
Why it matters: Telemedicine conjures up images of video visits with a physician, but this statistic offers insight into how end consumers actually adopt it. Limited broadband availability in rural and minority communities may be a contributing factor, and until that gap is addressed, wider adoption of telehealth will remain difficult.
Artificial Intelligence: RAI’s certification to prevent AI from turning into HALs
What: The Responsible Artificial Intelligence Institute (RAI), a non-profit, hopes to offer a more standardized means of certifying AI solutions.
How: RAI has built a concrete Build, Accredit, Audit, and Certify framework that assesses AI solutions along the dimensions of Accountability, Bias and Fairness, Data Quality, Explainability and Interoperability, and Robustness.
Why it matters: We have seen how AIs can go rogue in fiction: 2001: A Space Odyssey’s HAL computer eliminates most of the crew. More recently, Microsoft’s Tay debacle, Facebook’s algorithms spreading online hate, and Clearview’s facial recognition surveillance system have caused public outrage; the power and opaqueness of these algorithms fuel fear of AI itself. Certifying AI systems, much as LEED certifies buildings, adds transparency and should encourage adoption.
World Wide Web: Link rot and its impact on the web
What: Research has shown that many important links on the web get lost to time. For example, according to a team of researchers from Harvard Law School, a quarter of the deep links in The New York Times’ articles are now rotten, leading to completely inaccessible pages. The accompanying graph in the post shows how link rot grows with an article’s age.
How: When an old page moves or gets archived, the new location is not published at the old address. For example, suppose an article published in 1998 hard-coded a link to an external page that has since been moved or archived. The original link no longer resolves, and someone else can later publish completely opposing content at that address, undermining the integrity of the original article. The Harvard Law School study examined 550,000 New York Times articles containing 2.2 million links to external websites; 72% of those links were “deep,” pointing to a specific page rather than a general website. Among the deep links, 72% of those from 1998 were dead, compared with only 6% from 2018.
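The study’s distinction between “deep” links and links to a site’s front page can be sketched with a simple URL classifier. This is a minimal illustration of the idea, not the researchers’ actual methodology, and the example URLs are hypothetical:

```python
from urllib.parse import urlparse

def is_deep_link(url: str) -> bool:
    """Return True if the URL points to a specific page (a "deep" link)
    rather than a site's bare front page."""
    parsed = urlparse(url)
    # Anything beyond the bare domain -- a path, query string, or
    # fragment -- counts as deep.
    return parsed.path not in ("", "/") or bool(parsed.query) or bool(parsed.fragment)

links = [
    "https://example.com",
    "https://example.com/",
    "https://example.com/1998/old-story.html",
    "https://example.com/search?q=linkrot",
]
deep = [u for u in links if is_deep_link(u)]
print(f"{len(deep)} of {len(links)} links are deep")  # → 2 of 4 links are deep
```

Deep links are the fragile ones: a site redesign can break a specific article path while the front page keeps working, which is why the study’s dead-link percentages focus on them.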
Why it matters: Imagine a situation where an original video or article succumbs to link rot and something else is published in its place, creating confusion and panic. One solution is Wikipedia’s practice of linking to a page’s archived copy on sites like the Wayback Machine. Another, the Perma.cc project, attempts to fix link rot in legal citations and academic journals by providing an archived version of the page alongside the original source. Many other areas need this capability, and it is certainly something for a startup to think about. Any takers?