In this episode, I discuss the recent news about how Google believes it is possible for a smartphone to interpret sign language and read it out loud.
But is that a good thing? Do we need it?
As technology gets more and more sophisticated, we have seen gadgets that have changed our lives, in many cases for the better. With the rise of technology such as smart speakers, we are becoming more demanding about what our gadgets can do.
Remember how the remote control was seen as an innovative gadget? Now, it’s just something plain and normal.
But has technology gone too far when it comes to using artificial intelligence to interpret sign languages? Or should we encourage it, like we already do with oral languages?
You can listen to the podcast below or scroll down to read the transcript:
- Google’s blog about interpreting sign languages using artificial intelligence (Source: Google AI Blog)
- Software developer demonstrates communicating with an Amazon Echo using sign language (Source: The Verge)
This is The Hear Me Out! [CC] Podcast, a place to hear stories from d/Deaf and hard of hearing people, and from your host Ahmed Khalifa.
We take technology for granted. We see all these things around us and we think it's kind of normal now. These voice-activated speakers, Alexa, Siri, Google Home, are all so normal now. We got used to them. We don't think they're special anymore.
And I find that quite funny, because they are amazing pieces of technology. I used to joke in the past, and I think I've mentioned this in another episode, about how technically the remote control is a piece of tech, a gadget, and it's making our lives easier.
And we look at it and we think it’s just a simple boring technology now and we take it for granted what it can do and it’s normal. Whereas decades ago it was revolutionary for all of us.
And I find it quite funny when we look at where we are right now with technology: it's getting more advanced, getting more into artificial intelligence and 3D, and it's becoming more normal to see it in our day-to-day lives.
So I was very curious when I heard about Google's latest way to translate sign language.
Basically, on their artificial intelligence blog, they were updating the audience about their progress towards translating sign languages, and I thought, "Wow, how does that work? How can you use a camera to understand hand gestures?"
That got me very interested, so I read about it, and the technology is amazing. I'm not going to go into too much detail about it here; if you do want to read the post and see a demonstration, the link is in the show notes above.
But I watched the demo they have on the blog, I read about it, and on the one hand I thought it was pretty clever. It is pretty sophisticated, and I thought, "You know what? This can go somewhere; it can go far."
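To make the idea a bit more concrete, here is my own simplified sketch of the basic principle; this is purely illustrative and not Google's actual code, and the landmark numbers and sign names are made up. The camera gives you a set of hand landmark coordinates per frame, and the software matches that pattern against known signs, for example by finding the nearest stored template:

```python
import math

# Toy sketch: classify a hand pose from landmark coordinates.
# Real camera-based systems detect many 3D landmarks per hand from
# the video image; here we just assume we already have a few (x, y)
# points and match them against hypothetical stored templates.

# Hypothetical templates: flattened (x, y) landmark lists per sign.
TEMPLATES = {
    "open_hand": [0.0, 0.0, 0.2, 0.8, 0.4, 1.0, 0.6, 0.8, 0.8, 0.6],
    "fist":      [0.0, 0.0, 0.1, 0.2, 0.2, 0.25, 0.3, 0.2, 0.4, 0.15],
}

def distance(a, b):
    """Euclidean distance between two flattened landmark vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(landmarks):
    """Return the name of the template closest to the observed landmarks."""
    return min(TEMPLATES, key=lambda name: distance(TEMPLATES[name], landmarks))

# A frame whose landmarks are close to the "fist" template:
observed = [0.0, 0.0, 0.1, 0.25, 0.2, 0.2, 0.3, 0.25, 0.4, 0.1]
print(classify(observed))  # -> fist
```

Notice what this kind of matching works on: hand shape alone. Facial expression, body movement and signing speed are nowhere in the input, which is exactly the limitation I keep coming back to.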
And maybe it can help a certain situation when you want to overcome the communication barrier between a hearing and a d/Deaf person.
Maybe that's what it's going to be used for, I hope. But then I thought: this is not necessarily going to be a good thing. The thing with sign language is that it's not just about hand gestures; it's about other things too.
It's about your facial expression, it's about your body language, it's about the tone, and by tone I mean even the speed of the signing makes a difference to what you're trying to say and how it's interpreted.
So there’s no way that it can interpret that. And I was looking at it thinking, “That’s not going to work.”
But that’s the thing with sign language and that’s the beauty of sign language is that it’s complete in a sense that you use a number of different aspects of your body.
And when I watched this, it was mainly focusing on hand gestures only. Who knows, in the future are they going to start analysing your facial expressions and facial muscles, and work out whether you are angry or sad or happy or mad or hungry?
Is hungry possible when you’re doing the facial expression? I don’t know. Maybe we’ll find out in the future.
But it just doesn't work that way. And I just don't know how I feel about this technology. I'm trying to work out when this is going to be useful for the world.
Maybe one way of doing it, and maybe how it's going to help, is by interpreting the hand gestures into text. Maybe that's going to work. So from text to audio, or audio to text, with the hand gestures as the input into that medium.
Maybe that’s how it’s going to be. Maybe it’s going to be used for helping to make simple commands around the house and then connect it with your voice activated speaker.
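As a rough illustration of that idea, here is a hypothetical sketch of how recognised signs could be turned into text and then into a simple household command. All the names, signs and commands here are made up by me; a real system would get the sign labels from a camera-based recogniser and hand the text to an actual smart-speaker service:

```python
# Toy sketch of the sign -> text -> command pipeline described above.

# Hypothetical mapping from recognised sign labels to text fragments.
SIGN_TO_TEXT = {
    "LIGHT": "light",
    "ON": "on",
    "TIME": "time",
}

# Hypothetical household commands keyed by the assembled text.
COMMANDS = {
    "light on": "Switching the light on.",
    "time": "It is 9 o'clock.",
}

def signs_to_command(signs):
    """Join recognised sign labels into text, then look up a command."""
    text = " ".join(SIGN_TO_TEXT[s] for s in signs)
    return COMMANDS.get(text, "Sorry, I did not understand that.")

print(signs_to_command(["LIGHT", "ON"]))  # -> Switching the light on.
```

Even in this toy form, you can see it only handles short, fixed commands; free-flowing conversation in sign language is a completely different problem.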
Then there is technology that exists right now: a student at MIT took the basics of ASL, connected it to an Alexa speaker, and it managed to understand a person signing basic commands in ASL. I'm talking things like "What time is it?" or "What's the weather?" And I thought that was quite interesting as well.
So it was interesting for me to see that, although I'm not going to use Alexa anyway.
One thing I learned when I went to an accessibility conference came from a group discussion about technology, where there were different people with different disabilities, each with their own accessibility requirements.
And it was very interesting to learn from them about the technology they use and depend on.
One of the obvious ones was, again, the smart speaker, the voice-activated speaker, and I mentioned that I don't really find it helpful for me.
And I know you can get one with a screen on it, but in general I don't really think I'm going to benefit from it, because the idea is that you don't just sit there and look at the screen; you want to go about your day while it does whatever you want in the background.
But then I noticed that there were other people who found it very, very useful and almost life changing for them as well.
For example, there were people in wheelchairs who said that it helped them to do things they would otherwise struggle to do because of mobility.
Simple things that we take for granted, like switching the light on. If you connect it all together, you can use the voice-activated speaker to say, for example, "Alexa, turn on the light."
Then it switches on the light, and that makes things easier for those who have mobility issues. I found that very interesting. It made me realise that that's a good point. And it's the same thing for those who are blind as well.
They are able to use the speaker to access a lot of information that they might otherwise struggle with if they were using a computer and the website they're accessing is not very accessible.
But with the speaker, they're able to get accurate information straight away. They get to listen to whatever they're looking for, and that's it. And again, I thought that's a good point.
I never really thought about that, and that's the whole point of learning from other people: you want to learn from their perspective and their experience, because you will never understand if you don't walk in their shoes, if you don't live their lifestyle.
So I find it very interesting how technology can be very useful, and for some people life-changing, in that respect.
And of course, in the deaf and hard of hearing world, there is obviously the technology of hearing aids, or the captioning tools and devices that are available, including automated ones.
There are all these things, so much good technology out there that is useful. I'm using hearing aids; they are sophisticated. I remember 10, 20 years ago, they were terrible.
I used to hate wearing them because, first of all, I found them uncomfortable, but second of all, the quality of the audio and the sound, I just didn't think it was accurate at all. I thought it was terrible.
But technology has come a long way since then, and it has evolved into other types of products for other people.
For example, a doorbell with a flashing light, so you can see it from another room. Or a smoke alarm that does the same thing with a flashing light. Or an alarm clock that can also vibrate, so you can feel it beside your bed or under your pillow even when you can't hear it.
These are the things where technology is useful. But I guess it is different when you’re looking at what Google is trying to do. It could be useful, but is it really a good thing?
Because with languages, you can't use robots all the time. You do miss certain things that only humans can do, I believe. I speak a few languages and I'm learning BSL as well, and it has made me realise that it's not just about what you are saying; it's how you say it.
And it’s also about giving you access to the community and the culture and the food and the country and so many different things. And that’s the whole point of languages.
And I just saw that article, and I can't see how it's going to give you access to those parts of language learning. And at the end of the day, don't forget that there are hundreds of sign languages as well, so how is that going to work?
Is it going to start understanding all the sign languages? And even if it focuses on, for example, ASL, there are still regional variations of certain signs.
And that will need to be taken into account when you’re going to a specific country or city or town. They will have their own definition of a particular word in sign language.
How is that going to be incorporated? I don’t know. Maybe they have that in the pipeline, maybe they’re thinking about that.
But I just don't know how I feel about whether it is a good thing. Is it going to replace sign language? I don't think it is; it will never replace sign language.
Is it going to replace something else that I think is more important, which is human interaction? That's the argument we see more and more now about social media.
It's removing an essential element of communication and relationship building when it's all done online. Or even when it's face to face, you're still looking at your phone when you are with other people.
And that obviously removes the social aspect of being with other people. Is sign language interpretation using artificial intelligence going to do that as well?
You can argue that we already have translation for other languages, and I'm talking about oral languages here: if you want to translate from English to Spanish, French or German, you have that technology available, and of course it's useful.
When you go abroad and you want to find out the meaning of certain things you buy in a supermarket, of course it’s useful.
But you're not going to use that to completely take over the social side of conversation, getting involved with people and socialising with them. It will never replace that, in my opinion. You just lose that element of relationship and communication.
So I think it's going to be like that for this new technology that Google is trying to build and evolve. It will be interesting to see how it goes from here.
And I am wondering what do you think about it? Do you think it’s a good thing to have technology that can interpret sign language?
But it's only the hand gestures; it's not able to recognise facial expressions, body language, the speed of signing, anything like that.
And it's not able to recognise local dialects yet either. But does that matter to you? Do you think it would be useful for you, whether you are learning sign language or you just want to look up the meaning of certain signs? Do you think it's useful?
Let me know; you can reach me via the details in the show notes. I would just love to know what other people think about it, because I am not sure about this yet.
I'm impressed with this technology; I'm impressed that it can do this. If you check out the show notes, you can see the demonstration. It is impressive. But is it a good thing in the long term?
I'm not sure. Let me know what you think. I'd love to hear from you, and I would also appreciate it if you could leave a review on whichever platform you're using, whether Apple Podcasts, Spotify, or any other platform.
In the meantime, I will speak to you again soon.