#ifihadglass I’d write an app to help colorblind people by using textures on surfaces to distinguish colors.
If I had Glass, I’d want to see an app that helps color blind people like me with everyday struggles. Cooking can be dangerous when you can’t tell the color difference between rare and raw meat, fashion choices are downright embarrassing, and driving is much more of a challenge.
How do you design websites or airport security signs for colorblind people? More than 10% of people are #colorblind. #ifihadglass, I’d write a #freeapp that 1) highlights and labels confusing colors in realtime for colorblind people and 2) lets anyone experience and understand #colorblindness firsthand.
#ifihadglass I would appreciate my first sunset. Colorblindness affects many of us. Overlaying a basic color-palette interpreter (currently clunky and awkward, holding your phone camera up to things) would let me check what I’m about to wear while getting dressed. Is that warning light blinking yellow or red? Dream with me…
My Dad is a colorblind artist, so I’m with you. It would be interesting to explore giving you an alternate spectrum of color that you actually could see. No blinking lights here; how about simply remapping to full color according to your palette, giving you better alignment and understanding with what’s considered normal…
Felipe, I appreciate your interest, and thank you for not asking what color your shirt is. I’m sure it would technically be interesting to see this way, but just as a novelty. Oh look… sky. Oh look, over there… more sky? But I can take a picture with my phone and run one of many apps that can identify colors and give me a best-guess color code. Maybe not the most elegant, but at least I’d be able to see more than just “sky”. If that could be integrated into Google Glass… wow.
Maybe for those “red-green” colorblind people (one of the most common types), you could have an auto filter that just shifts the colors along the spectrum in real time. Boom. No more mixing up red and green. Like web-safe colors, but for CB idiots like myself.
I’d call it “Suddenly Sunset”.
Maybe completely unrealistic, but if it was possible?
#ifihadglass – Being color blind (or, better described, color deceived) has its challenges. It would be great to develop an application for Glass that performs color confirmation. By placing crosshairs on an object, Glass could determine the closest primary color by averaging the pixels within the targeted region.
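The averaging idea in the post above is simple enough to sketch. A minimal Python sketch; the specific primary-color palette and the sample region are assumptions for illustration, not anything Glass actually provided:

```python
# Sketch: average the pixels inside the crosshair region, then name
# the closest primary color by Euclidean distance in RGB space.
# PRIMARIES is an assumed palette for illustration.

PRIMARIES = {
    "red": (255, 0, 0),
    "green": (0, 255, 0),
    "blue": (0, 0, 255),
    "yellow": (255, 255, 0),
    "white": (255, 255, 255),
    "black": (0, 0, 0),
}

def average_color(pixels):
    """pixels: iterable of (r, g, b) tuples sampled from the region."""
    pixels = list(pixels)
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def closest_primary(rgb):
    """Return the palette name with the smallest squared RGB distance."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(rgb, c))
    return min(PRIMARIES, key=lambda name: dist(PRIMARIES[name]))

region = [(200, 30, 40), (180, 20, 35), (210, 45, 50)]  # mostly red pixels
print(closest_primary(average_color(region)))  # prints: red
```

Averaging before classifying is what makes this robust to noise: any single pixel might be off, but the region mean lands near the dominant color.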
#ifihadglass I would create an application to assist color blind people (like myself). The application would receive camera input, re-map certain colors to avoid the conflicts common in color blindness, and display the result on the heads-up display.
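The re-mapping step could be done per pixel in HSV space. A minimal sketch assuming a red-green (deuteranopia-style) deficiency; the hue band and the shift amount are assumed tuning values, not a validated model:

```python
import colorsys

def remap_for_deuteranopia(rgb, shift=0.15):
    """Shift hues in the green band toward cyan so colors commonly
    confused with red become distinguishable.
    rgb: (r, g, b) in 0-255. shift: assumed hue offset in [0, 1)."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # Pure green sits at hue ~0.33; push the green band toward cyan,
    # leaving reds (hue ~0.0) where they are.
    if 0.2 < h < 0.45:  # roughly the green band (assumed range)
        h = min(h + shift, 1.0)
    r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
    return tuple(round(c * 255) for c in (r2, g2, b2))

print(remap_for_deuteranopia((0, 200, 0)))   # green picks up a cyan tint
print(remap_for_deuteranopia((255, 0, 0)))   # red passes through unchanged
```

On real hardware you would apply this to every frame, ideally in a GPU shader rather than per pixel in Python, but the mapping itself is the same.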
I wrote a brief description of an app for a Google Glass competition (link below). The idea is to help colorblind people identify colors by selecting regions where color would normally be confusing to a colorblind person and then augment it in real time. The augmentation could be text labels (is it purple or blue), increases in contrast or saturation (show it as definitely purple or blue), or something else (?). You could also use it in reverse if you weren’t color blind and wanted to see how something would look to a colorblind person (product design perhaps).
I do actually want to write this app for myself, as I am colorblind, and open-source it. I have plenty of experience designing and writing mobile apps, but I’ve never considered UI/UX for a colorblindness app. I’ve tried several of the simulated colorblindness apps/websites over the years, but none of them actually seem to work very well (things that should look the same don’t). Anyone have any experiences building or using similar apps? Any thoughts or suggestions will help and are welcome, though I’m particularly interested in ideas around what would make a good user experience and what would help with accuracy.
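One of the augmentations mentioned above (making an ambiguous region read as “definitely purple or blue”) could be as simple as a per-pixel saturation boost. A minimal Python sketch; the boost factor is an assumed tuning value:

```python
import colorsys

def boost_saturation(rgb, factor=1.6):
    """Exaggerate saturation so a washed-out, ambiguous color becomes
    clearly one hue or the other. factor is an assumed tuning value."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # Clamp so fully saturated colors are left alone.
    r2, g2, b2 = colorsys.hsv_to_rgb(h, min(1.0, s * factor), v)
    return tuple(round(c * 255) for c in (r2, g2, b2))

muted = (120, 100, 160)          # a "maybe purple, maybe blue" color
print(boost_saturation(muted))   # same hue and brightness, stronger color
```

Because hue and value are held fixed, the augmented image stays recognizable; only the ambiguous chroma is exaggerated.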
Sometimes when I’m not sure about the color of something, I take a picture and then apply random filters to it to try to bring the color out. Maybe you could have a couple of default filters for users to toggle through. This sounds like a great idea, I hope you make this app!
Both! I’ve found that simulated apps don’t really do a good job, and I’m sure there’s a better way. I’ve worked with a number of designers that have trouble understanding colorblindness even after trying existing simulation apps.
Really this could be two separate apps, but I think the underlying model (correctly mapping one color space to another on a particular set of hardware) could be shared.
Realtime simulation glasses would be fantastic. All the current solutions I know of are either expensive (well, Google Glass is too, I guess), slow, or require interaction.
edit: Keep in mind that most simulations model total blindness to a color (dichromacy). For most affected people the result will look too extreme, since milder deficiencies (anomalous trichromacy) are much more common.
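One way to address the point above is to blend a full-dichromacy simulation with the identity transform by a severity factor, so mild deficiencies are not rendered as total color blindness. A sketch; the protanopia matrix below is a commonly circulated rough approximation, not an authoritative model, and the severity values are illustrative:

```python
# Sketch: interpolate between normal vision (identity matrix) and a
# full-protanopia simulation matrix by a severity factor in [0, 1].
# The matrix is a rough, commonly circulated approximation.

PROTANOPIA = [
    [0.567, 0.433, 0.0],
    [0.558, 0.442, 0.0],
    [0.0,   0.242, 0.758],
]

IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

def simulate(rgb, severity=1.0):
    """severity 0.0 = normal vision, 1.0 = full protanopia."""
    m = [[(1 - severity) * IDENTITY[i][j] + severity * PROTANOPIA[i][j]
          for j in range(3)] for i in range(3)]
    return tuple(
        min(255, max(0, round(sum(m[i][j] * rgb[j] for j in range(3)))))
        for i in range(3)
    )

pure_red = (255, 0, 0)
print(simulate(pure_red, 1.0))  # full protanopia: red collapses toward olive
print(simulate(pure_red, 0.3))  # mild deficiency: a much subtler shift
```

The same interpolation idea works for any dichromacy matrix, which is how a simulator could cover the common mild cases instead of only the extreme ones.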
edit2: And I deleted the question seconds after you replied, when I realised my mistake. Duh! :D