There are a few interesting hacks from people experimenting with feedback between Spatial Computing devices and vision-impaired use cases. A case in point is this Unity test using audio sonar to "create a 'noise field' around the listener that directly communicates surface distances":
https://x.com/TheMirzaBeig/status/1823489371279937885
https://m.youtube.com/@MirzaBeig/videos
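For readers who want to play with the core idea without Unity: below is a minimal Python sketch of the same sonification concept, where distances sampled in a fan of directions around the listener become stereo tones (nearer surfaces are louder and higher-pitched, panned to their bearing). The distance values are invented for illustration; the linked demo is a Unity/C# project and works differently in detail, so treat this as a sketch of the technique, not the demo's implementation.

    # Sketch of a "noise field": sonify (distance, bearing) rays as stereo pings.
    # Fake depth rays stand in for lidar / depth-sensor input.
    import math
    import wave

    import numpy as np

    SAMPLE_RATE = 44_100
    TONE_SECONDS = 0.15      # one short ping per ray
    MAX_RANGE_M = 5.0        # surfaces beyond this are silent

    def sonify_ray(distance_m: float, bearing_rad: float) -> np.ndarray:
        """Turn one (distance, bearing) sample into a stereo tone frame."""
        t = np.linspace(0.0, TONE_SECONDS, int(SAMPLE_RATE * TONE_SECONDS), endpoint=False)
        closeness = max(0.0, 1.0 - distance_m / MAX_RANGE_M)  # 0 = far, 1 = touching
        freq = 220.0 + 660.0 * closeness                      # nearer -> higher pitch
        amp = 0.8 * closeness                                 # nearer -> louder
        mono = amp * np.sin(2 * math.pi * freq * t)
        # Constant-power pan: bearing 0 is straight ahead, +/- pi/2 is right/left.
        pan = 0.5 * (1.0 + math.sin(bearing_rad))             # 0 = left, 1 = right
        left = mono * math.cos(pan * math.pi / 2)
        right = mono * math.sin(pan * math.pi / 2)
        return np.stack([left, right], axis=1)

    # Invented scene: a wall closing in on the listener's right side.
    bearings = np.linspace(-math.pi / 2, math.pi / 2, 9)
    distances = np.linspace(4.5, 0.8, 9)

    frames = np.concatenate([sonify_ray(d, b) for d, b in zip(distances, bearings)])
    pcm = (np.clip(frames, -1, 1) * 32767).astype(np.int16)

    with wave.open("noise_field_demo.wav", "wb") as f:
        f.setnchannels(2)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(pcm.tobytes())

Running it writes a short WAV you can listen to: the pings sweep from faint and low on the left to loud and high on the right as the simulated wall gets closer.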
HN ranking history for this thread, https://hnrankings.info/43067002/
Meta has Project Aria for research, https://news.ycombinator.com/item?id=43066927, but no public SDK. Devices dedicated to vision-impaired users are north of $2000. Hopefully Envision Companion (https://www.letsenvision.com/companion) software can be ported to $300 Meta smart glasses or the growing list of open glass hardware that is compatible with OSS AugmentOS.
This thread led me to Mentra's MIT-licensed https://github.com/AugmentOS-Community/AugmentOS & https://augmentos.org/ for Android and (soon) iOS:
> Smart glasses OS, with dozens of built-in apps. Users get AI assistant, notifications, translation, screen mirror, captions, and more. Devs get to write 1 app that runs on any pair of smart glasses.
Where "any" means this HCL:
Apple iPhones have state-of-the-art hardware (lidar, UWB precise positioning) that could help millions of visually impaired humans, but that hardware has been limited by amberware (software frozen with minimal updates). Apple poured billions into now-cancelled money pits like Apple Car and VisionOS, while teams failed forward into AI, smart glasses and humanoid robots. Meanwhile, Meta smart glasses are S-curving from 2M to 10M nodes of data acquisition, https://news.ycombinator.com/item?id=43088369

On paper, Apple Magnifier with Live Descriptions audio could justify the purchase of an iPhone Pro for dedicated single-app usage, https://support.apple.com/guide/iphone/live-descriptions-vis.... But while it works for short demos, the software is not reliable for continuous use. UWB AirTags and lidar 3D imaging could enable precise indoor navigation for vision-impaired users, but 5+ years of shipping hardware has not led to usable software workflows.
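To make the point concrete, the positioning math is not the hard part. Given UWB range measurements to a few beacons at known positions, a user's position falls out of linearized least-squares trilateration. The sketch below uses invented anchor positions and ranges; and as noted above, Apple does not expose AirTag ranging to third-party apps today, which is precisely the kind of software-layer blockage being complained about:

    # Trilateration sketch: estimate a 2D position from ranges to known anchors.
    import numpy as np

    def trilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
        """Least-squares position from >=3 anchors (N x 2) and ranges (N,)."""
        # Subtract the first anchor's circle equation from the rest,
        # yielding a linear system A p = b in the unknown position p.
        x0, r0 = anchors[0], ranges[0]
        A = 2.0 * (anchors[1:] - x0)
        b = (r0**2 - ranges[1:]**2
             + np.sum(anchors[1:]**2, axis=1) - np.sum(x0**2))
        p, *_ = np.linalg.lstsq(A, b, rcond=None)
        return p

    # Invented scene: three beacons in a 6 m x 4 m room, user at (2, 1).
    anchors = np.array([[0.0, 0.0], [6.0, 0.0], [0.0, 4.0]])
    true_pos = np.array([2.0, 1.0])
    ranges = np.linalg.norm(anchors - true_pos, axis=1)
    ranges += np.random.default_rng(0).normal(0, 0.05, size=3)  # ~5 cm UWB noise

    print(trilaterate(anchors, ranges))  # ~ [2.0, 1.0]

With UWB's few-centimeter ranging accuracy, that recovers the user's position to within a few centimeters, which is why the lack of an open software workflow, not the hardware, is the bottleneck.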
The economic tragedy is that 95% of the technology for helping vision/cognition impaired humans could be repurposed for humanoid robots, with the R&D bonus that humans can provide more real-world feedback (RLHF!) than silent lab robots. Until Apple breaks vision stasis with new technical leadership, or Apple/Meta are regulated by the EU into unlocking reluctant open-glass-platform innovation, the only hackable option is open glass hardware, destined for future BigTech sherlocking or acquisition.
Disparaging these companies for making products people want while writing angry posts that these companies aren't doing what you want.
Apple and Meta should use their amazing technology for accessibility. And maybe they are or will. But they owe me nothing.
If I don't want their products I don't have to buy them. If I don't want their stock I don't have to invest in it.
Dear unsupp0rted,
> If I don't want their products I don't have to buy them.
The U.S. Federal Trade Commission (FTC) has been known to have helpful communications with vendors about promises made to consumers. Customer feedback on commercially advertised features is usually considered valuable market research.
Sep 2024, https://mashable.com/article/meta-rayban-smart-glasses-be-my...
> The Ray-Ban Meta Smart Glasses are entering a new market, as the company announces a new collaboration with accessibility service provider Be My Eyes that will help wearers get the assistance they need.
May 2023, https://www.apple.com/newsroom/2023/05/apple-previews-live-s...
> Apple today previewed software features for cognitive, vision, hearing, and mobility accessibility, along with innovative tools for individuals who are nonspeaking or at risk of losing their ability to speak. These updates draw on advances in hardware and software, include on-device machine learning to ensure user privacy, and expand on Apple’s long-standing commitment to making products for everyone.
Mar 2008, Steve Jobs, Apple co-founder and CEO, https://allaboutstevejobs.com/verbatim/interviews/fortune_20...
> Our DNA is as a consumer company - for that individual customer who's voting thumbs up or thumbs down. That's who we think about. And we think that our job is to take responsibility for the complete user experience. And if it's not up to par, it's our fault, plain and simply.
> If I don't want their products I don't have to buy them.
You see, that's why there is so much negative sentiment toward these glasses.
Meta, in their insatiable hunger for data, is trying to convince people to pay to record my life.
I have chosen not to use anything Meta, and I will not purchase these abominations, yet I still have to deal with morons who pay to do their data collection.
> Disparaging these companies for making products people want while writing angry posts that these companies aren't doing what you want.
I'm not seeing the "angry posts" you are referencing here. Neither the HN post you're replying to nor the linked post come off as angry to me. Disappointed, perhaps.
And that aside... NOT disparaging these companies certainly isn't getting them to do the right thing. Informing others who might not otherwise be aware of these deficiencies seems to me the most likely way to exert subtle pressure to get these companies to do the right thing eventually.