Very impressive. Reminds me of Be My Eyes, which uses humans to help visually impaired people see. Although less scalable, the latter may help elicit empathy and raise awareness among people without visual impairments.
@rrhoover very true - but could be even more awesome in combination! :D
AI as first line for easy tasks and Community as fallback for the harder ones..
As one of the makers of @bemyeyes I hope the new management listens in..!
It's interesting to think what this technology could do for other industries too. For example, if Shyp could identify what you're sending and give you a better estimate on the spot.
@lindsayclamb great idea! It could also calculate the width and height of whatever you point the device at, to give an even better estimate.
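For what it's worth, here is a rough sketch of that measurement idea, assuming ARKit-style plane detection on the phone (purely hypothetical; nothing like this has been confirmed by Shyp or Aipoly). The user taps two corners of the box on screen, each tap is hit-tested against a detected plane, and the distance between the two world-space points approximates one edge of the package.

```swift
import ARKit
import simd

// Hypothetical sketch: estimate one edge of a package by hit-testing two screen taps
// against detected planes and measuring the distance between the resulting 3D points.
func edgeLength(between firstTap: CGPoint,
                and secondTap: CGPoint,
                in sceneView: ARSCNView) -> Float? {
    // Hit-test each screen point against existing planes to get world coordinates.
    guard
        let a = sceneView.hitTest(firstTap, types: .existingPlaneUsingExtent).first,
        let b = sceneView.hitTest(secondTap, types: .existingPlaneUsingExtent).first
    else { return nil }

    // The world position sits in the last column of each hit's transform.
    let pa = simd_make_float3(a.worldTransform.columns.3.x,
                              a.worldTransform.columns.3.y,
                              a.worldTransform.columns.3.z)
    let pb = simd_make_float3(b.worldTransform.columns.3.x,
                              b.worldTransform.columns.3.y,
                              b.worldTransform.columns.3.z)

    // Distance in metres; repeat for width, height, and depth to estimate shipping size.
    return simd_distance(pa, pb)
}
```

Repeating this for two or three edges would be enough to quote a rough package size on the spot.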
This is a fantastic idea; I have a few friends with sight difficulties who would love to try this app. I noticed the app itself is quite large (134 MB). Can you explain why?
@rohit0130 Thank you! The app is large because it hosts the entire neural network on the phone, which is what lets us run all the image recognition offline. The network has been trained on more than 10 million images, so 134 MB is actually not that big :)
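For anyone curious what "hosting the network on the phone" can look like in code, here is a minimal sketch of on-device classification using Apple's Core ML and Vision APIs. This is only an illustration under that assumption, not Aipoly's actual engine, and the ObjectClassifier model name is hypothetical.

```swift
import Vision
import CoreML
import UIKit

// Minimal sketch of offline image recognition: the trained model ships inside the
// app bundle (hence the large download) and inference runs entirely on the device.
func classify(_ image: CGImage, completion: @escaping (String?) -> Void) {
    // ObjectClassifier is the class Xcode would generate for a bundled .mlmodel (hypothetical).
    guard let model = try? VNCoreMLModel(for: ObjectClassifier().model) else {
        completion(nil)
        return
    }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Return the label of the top classification result, if any.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier)
    }

    // No network call anywhere: the handler evaluates the model locally.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The trade-off is exactly the one raised above: a bigger app binary in exchange for recognition that works with no connection at all.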
This is pretty great! I really like how technology is changing people's lives!
I've also written a tiny piece over at App Recap about it :) http://apprecap.net/aipoly-augme...
This "live lens" on a mobile phone feels like a game changer. Would be interesting if it could tell you *who* (John, Jane, Sam, etc) or exactly *what* (Ranunculus, a spleen, etc) you're looking at, too. Computer vision is gonna make a huge impact on every industry (education, tracking, security, inventory, etc). Exciting times ahead.
Bravo!
I downloaded and tested it out. It feels very much like a v1: it can recognize basic things like a cup, a cup of water, headphones, and a laptop, but it had trouble recognizing a watch, a handbag, and a scarf.
It is very fast, though, and I can see it improving the more it is used.
I'd love it to have a wearable - smartwatch component too :)
Very cool, although I'm kind of surprised at this implementation. A day before this was featured on PH, I wrote an opinion piece on Medium about AR claiming, in part, that the demands inherent in AR recognition tasks dictate that it would have to be done off-device. (That piece is here: https://medium.com/@doctorhandsh...) ... I wonder if you have opinions just yet on how far this implementation will scale and/or if you think parts of this task will move off-device at a certain point?
It looks and acts very promising and impressive.
I signed up for the trial.
However, it is not perfect: it makes mistakes and does not recognize many languages or currencies.
Pros: helps people with disabilities detect colors, US dollars, objects, and more; also reads text in 7 languages.
Cons: not always accurate, needs further development.