
Title

The smart chip augmented experience

Design Team

Suresh

Peer Review Team

Ayman

The smart chip augmented experience doc

  • Comments added by Steve

Overview

Conventional devices limit user mobility and confine people with special needs to their homes. This is especially problematic when these people, including those who are blind, deaf, aged, or wheelchair-bound, remain confined to that space for a lifetime. The Smart Chip is poised to bring remarkable changes to the way people live, especially people with special needs. The aim behind the smart chip is to create an augmented world for these users, tailored to their individual needs across a variety of common settings. Technological advancements have led to high-precision biosensors, a key aspect in generating and collecting precisely accurate data. They have also led to 'smart' technologies such as smartphones, smart watches, smart glasses, and smart tablets, opening an entirely new path for people with special needs to fulfill ordinary tasks in a more flexible and remote manner, greatly reducing dependency on others and improving their lives. The promise of these smart devices and technologies, whether through hands-on touch screens or hands-free voice control and imaging over built-in Wi-Fi, MiFi, or Bluetooth, in conjunction with the integrated smart chip, will improve mobility and independence and enable a new augmented world for people with special needs.

THE SMART CHIP

Synopsis of the Smart Chip

The smart chip encompasses various components, including sensors, transmitters, a GPS, and processors. The sensor within the smart chip is highly networked and scalable, as well as smart and software-programmable. The custom-built smart chip circuit board is integrated into an embedded Linux board, which provides a high number of pins (more than 100 in total) that enable multiple sensors and controllers to be connected for any type of hardware. The board is programmable and compatible with any programming language. Data is transmitted from the Linux board to the Cloud App Engine, which exposes an application programming interface (API) with simple HTTP POST and HTTP GET operations. The smart chip is partnered with local outlets and service providers. The result is ultimately an empowered augmented reality, with the benefits of added convenience, time and cost savings, increased safety, better self-reliance, and independent living. Additionally, the smart chip is adept at quick data acquisition, needs very little maintenance, and is highly reliable and dependable. The smart chip supports all of the smart accessible devices listed below; however, the emphasis is placed solely on the AR Headphone, both to maintain precision and focus in this project and because the AR Headphone can be used by various community members. Figure 1 below shows the AR devices, including the AR Headphone, communicating with the Cloud App Engine and partnered outlets.

  • AR Gloves with standard headphones (wired or wireless): input via voice command or touch sensor; output is voiced back.
  • AR Headphone: input via voice command; a physical touch of the button is needed to authenticate and activate the device when first worn; output is automatically voiced back.
  • AR Glass with standard headphones (wired or wireless): input via hand-held remote or voice command; output is voiced back, a visual display, or robotic sign language.
  • AR Watch with standard headphones (wired or wireless): input via voice command, keypad, or touch sensor; output is a visual display, robotic sign language, or voiced back.

 

 

                                                     Figure 1 - Cloud Communication Environment

Architecture

The custom-built smart chip design can be viewed in “Figure 2 - The Smart Chip Design” and is discussed in greater detail below.

  • Embedded Processor: Schedules tasks, processes data, and controls the functionality of the other hardware components.
  • Transceiver: Responsible for the smart chip's wireless communication, which is RF based. Its operational states are transmit, receive, idle, and sleep.
  • Memory: The smart chip has two types of memory. Program Memory holds the instructions executed by the processor; Data Memory stores raw and processed information. The microcontroller's on-board flash, external flash, and RAM provide sufficient memory and storage.
  • Sensors: The smart chip carries multiple on-board sensors, and the number and type deployed vary with application requirements. They include light, pressure, temperature, humidity, low-resolution imagers, and accelerometers.
  • Geo-Positioning System (GPS): Responsible for navigation. It obtains the location fixes needed to pre-configure sensor locations at deployment. The Global Navigation Satellite System provides location and time information in all weather, anywhere on Earth, so the information the wearer needs is easily obtained via satellite-based GPS.
  • Power Source: The smart chip is powered through a wired electrical cord; if that power is lost, the built-in power-saving source kicks in automatically and operates for up to 72 hours.

                                    

                                                               Figure 2-The Smart Chip Design
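The transceiver's four operational states (transmit, receive, idle, and sleep) can be sketched as a minimal state machine. The transition rules below are illustrative assumptions, not part of the design; a real transceiver's duty cycle would be driven by the embedded processor's scheduler.

```python
from enum import Enum

class RadioState(Enum):
    """The transceiver's four operational states, as listed above."""
    TRANSMIT = "transmit"
    RECEIVE = "receive"
    IDLE = "idle"
    SLEEP = "sleep"

# Illustrative transition rules: which states each state may move to.
# (Assumption for this sketch; the document does not specify transitions.)
ALLOWED = {
    RadioState.SLEEP:    {RadioState.IDLE},
    RadioState.IDLE:     {RadioState.TRANSMIT, RadioState.RECEIVE, RadioState.SLEEP},
    RadioState.TRANSMIT: {RadioState.IDLE},
    RadioState.RECEIVE:  {RadioState.IDLE},
}

def transition(current: RadioState, target: RadioState) -> RadioState:
    """Move to `target` if the transition is allowed, otherwise stay put."""
    return target if target in ALLOWED[current] else current
```

For example, a sleeping radio must first wake to idle before it can transmit, which is why `transition(SLEEP, TRANSMIT)` leaves the state unchanged in this sketch.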

How It Works

  1. The smart chip automatically connects to the Cloud App Engine and runs continuously, 24/7, like a main server.
  2. This enables the sensors to run in accordance with the Cloud App Engine's requests.
  3. The smart chip periodically checks both the sensor database and the video database via the Cloud App Engine and automatically updates the data. It then performs the necessary create, read, update, and delete (CRUD) operations on the database.
  4. The smart chip receives the user's request through the Cloud App Engine and sends a response related to that request back to the Cloud App Engine, which then feeds it back to the wearer.
  5. It communicates continuously with the partnered outlets via the Cloud App Engine, 24/7, periodically checking for data and performing the needed CRUD operations.
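The periodic check-and-update cycle in steps 3-5 can be sketched as follows. This is a minimal simulation: the in-memory dictionary stands in for the sensor and video databases that the real chip would reach through the Cloud App Engine's HTTP POST/GET API, and all record names and fields are illustrative assumptions.

```python
# Minimal sketch of the smart chip's periodic CRUD cycle (steps 3-5 above).
# The dict below simulates the sensor/video databases; record ids and
# reading values are illustrative assumptions, not a real schema.

sensor_db = {}  # record id -> latest reading

def create(record_id, reading):          # C
    sensor_db[record_id] = reading

def read(record_id):                     # R
    return sensor_db.get(record_id)

def update(record_id, reading):          # U
    if record_id in sensor_db:
        sensor_db[record_id] = reading

def delete(record_id):                   # D
    sensor_db.pop(record_id, None)

def periodic_check(new_readings, stale_ids):
    """One cycle: ingest fresh readings, refresh existing ones, drop stale records."""
    for rid, value in new_readings.items():
        if read(rid) is None:
            create(rid, value)
        else:
            update(rid, value)
    for rid in stale_ids:
        delete(rid)
```

In the real system each of these calls would be an HTTP POST or GET against the Cloud App Engine rather than a local dictionary operation.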
                                 

User Community

  • Fully Blind or Visually Impaired Persons
  • Physically disabled Persons
  • Wheelchair Bounded Persons
  • Seniors and General population

The Device: AR Headphone

Description

The AR Headphone is a wearable, hands-free technology adept at communicating with multiple outlets. It embraces augmented-reality features, including live casting of the physical environment and a direct view of real-world surroundings. These AR components are produced accurately from sensory input such as sound, GPS data, and video. It is easily operated with voice commands and requires only minimal physical contact with the button located on one side of the headphone. The AR Headphone is lightweight, with all components fully covered by plastic available in various colours. It is highly durable in any weather conditions, very comfortable to wear, and easily adjusted to the user's head size. It requires either WiFi or mobile connectivity with Bluetooth to deliver its features; the exception is MiFi, which is based on a data plan.

The mechanism behind the AR Headphone is quite simple. It depends entirely on the smart chip, radio transceivers, and solar batteries. To establish communication between the AR device and the smart chip, the wearer activates the on/off button, provides a voice command, and follows the voiced-back prompts. Figure 3 shows the design of the AR Headphone.

Technologies Embedded

The following technologies are embedded beneath the surface of the AR Headphone:

  • Ambient Intelligence: A ubiquitous computing capability that senses shape, movement, and people, and responds to their presence.
  • Smart Grid: An electrical grid that uses information and communication gateways to acquire information and act upon it.
  • 4G: Enables ultra-broadband internet access and exchanges data at high speed (100 Mbit/s).
  • Android OS (Jelly Bean): An open-source, Linux-based operating system for mobile devices.
  • Bluetooth: Enables wireless communication. Using radio transmissions, it connects to and exchanges data with various devices in close proximity.
  • Wireless Router (MiFi): A hotspot that enables wireless high-speed (300 Mbit/s) internet access from any mobile network.
  • Radio Frequency Identification (RFID) tag: An ID system based on radio frequency. As a scanner, it collects, processes, and transmits information, including biometric data, wirelessly. As a location identifier, it reads and sends a unique identification number that lets products be tracked easily from a distance.
  • Wearable computing: Any technology-mediated device worn by the user that lets the wearer engage in activities such as taking pictures and video, reading and texting messages, and browsing the web.

Components & Accessories

  • Button
  • Finger Scanner
  • Battery (solar powered/rechargeable)
  • Speaker
  • Microphone
  • Camera (5 MP photos, 720p video)
  • Main Board
    • Microprocessor (CPU)
    • Memory chip
    • Gyroscope
    • Flash Memory (storage)
    • Compass
    • Accelerometer
    • Micro wireless router (data plan or prepaid card)
    • Wifi antenna
    • Bluetooth antenna
    • Micro USB
    • GPS sensors (built within the GPS chip)
                           

                                          Figure 3 - AR Headphone Design



Features

The following features are commonly found in other smart devices such as phones, glasses, and watches.

  • Makes a call to the dictated telephone number
  • Automatically dials people on the contact list when their names are voiced
  • Sends voice and text messages
  • Accesses the internet to search for information or video chat with family and friends
  • Broadcasts news and weather forecasts
  • Navigates places using Google Maps
  • Processes online personal errands (banking, online shopping, food delivery)
  • GPS chip: Contains both a transmitter and a receiver within the chip and enables navigation of the surroundings based on Google Maps
  • Records video or takes a picture
  • Translates from one language to another

The following are special features and benefits:

  • Solar Power Source: AR Headphone power is consumed by sensing, communication, and data processing with the smart chip. Batteries are the chief source of power, so the device has been made eco-friendly: it charges naturally via solar power during outdoor use by the wearer. The batteries are also rechargeable from a common power source, enabling continuous indoor operation. These energy-harvesting techniques keep the AR Headphone operating anywhere, at any time!
  • Ease of use: There is no limit to mobility; the AR Headphone goes wherever the wearer goes. It is mobile, portable communication free of inhibiting wires, it can withstand harsh environmental conditions, and it provides haptic feedback through the finger scanner and touch button.
  • Personal Digital Assistance: The smart chip is partnered with various outlets in the local community, including educational institutions, entertainment centers, religious institutions, retailers, hospitals, private and public transportation providers, and other community service providers.
  • Enriched Environmental Information: The smart camera in the AR Headphone senses movement, shape, heat, objects, and people, and sends this information to the smart chip, which guides the wearer to safety by identifying obstacles in the immediate surroundings and providing haptic feedback when the wearer touches an object.

How It Works

(Refer back to Figure 1 above)

  1. The wearer touches the button and the AR Headphone turns “on”.

  2. The AR Headphone then asks the wearer for a voice command in order to establish connectivity via one of the following: WiFi, Bluetooth, or micro USB. The exception is MiFi, for which the system does not ask for a voice command and connects automatically through the data plan.

  3. Once connectivity is established, the software application for this AR device begins to run.

  4. The AR Headphone then communicates with the Smart Chip (sensor) via the Cloud App Engine, a Platform as a Service (PaaS). PaaS provides cloud computing components to the software and thus supplies the application over the internet; it makes the development, testing, and deployment of applications much simpler, faster, and more cost-effective.

  5. The AR Headphone communicates with the sensor relevant to the user’s voice-command input. It then retrieves and updates data in both the sensor and video databases.

  6. Based on the retrieved data, the AR Headphone voices back the response the wearer needs.

                   Side note: Responses to the wearer’s input commands are pre-implemented in the application and dictated as voiced-back output.
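The command loop above, including the pre-implemented responses mentioned in the side note, can be sketched as follows. This is a minimal simulation: the command strings come from the user scenarios later in this document, but the routing function and response table are illustrative assumptions, not a real API.

```python
# Sketch of the AR Headphone's command loop. Per the side note above,
# replies to known commands are pre-implemented in the application.
# All command strings and replies here are illustrative assumptions.

PREIMPLEMENTED_RESPONSES = {
    "please video record": "Recording starts now",
    "please pause recording": "Recording paused",
    "resume recording": "Recording in progress",
    "end video recording": "Video recording stopped",
}

def handle_command(command: str) -> str:
    """Route a wearer's voice command through the (simulated) Cloud App
    Engine and return the phrase to be voiced back."""
    normalized = command.strip().lower()
    # In the real flow this lookup happens only after the Cloud App Engine
    # has consulted the smart chip's sensor/video databases.
    return PREIMPLEMENTED_RESPONSES.get(normalized, "Command not recognized")
```

Unrecognized commands fall through to a default reply, which is one simple way to keep the voiced-back output predictable.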


Potential Uses by the User Community

At Store

  • Provide directions (right, left, straight) according to the person's orientation
  • Anticipate turns, steps, narrow aisles, and objects on the floor in the surroundings
  • Indicate the precise location of products (aisle #, shelf #)
  • Identify different kinds of products on shelves and racks (including different brands of the same product)
  • Precisely describe products (weight, size, expiry, cost, care instructions)
  • Identify sale items and sale prices
  • Locate the customer service desk or a store employee
  • Automatically count the number of products in the cart and guide the wearer to the right checkout line (express, self-checkout, or regular)
  • Indicate the wait time based on the line-up at the checkout
  • Based on the payment method, identify denominations of bills and coins

At Hospital

  • Find services such as the food court, washrooms, and nearest entrances
  • Locate the registration desk, emergency department, security personnel, or a staff member
  • Anticipate turns, steps, narrow spaces, corners, and furniture in the surroundings, and provide directions according to the wearer's orientation
  • Identify types of doors (revolving, push/pull, automatic, sensor-based wave-hand)
  • Indicate the wait time based on the line-up at the registration desk or the # being called on the system
  • Automatically identify a professional entering the room by sensing their outfit (nurse, doctor)
  • Orient the wearer within the room; identify the location of the bathroom, call button, light switch, and the door to the hallway
  • Easily identify where objects in the room are: water pitcher, personal belongings, hand sanitizer, phone, and TV
  • Automatically scan the food tray and identify the food items and silverware on it; sense temperature levels and provide precautionary instructions for hot items (tea/coffee) or a room heater
  • Direct the wearer to the pay station; identify denominations to pay for parking, or indicate the correct location of a Visa/Mastercard in their wallet so it can be picked out easily to pay
  • Dictate sidewalk hazards, curbs, traffic signals, and the parking-lot # of their parked vehicle
                

User Scenario

Scenario 1

Anne is a grade 11 student at Mother Theresa Public School and has been medically diagnosed with dyslexia, which makes it difficult to learn to read or interpret words but does not affect general intelligence. Today is Monday and she is on a day 2 schedule of her 2nd-semester classes, which is as follows: 8:55-10:20 Math, 10:25-11:40 Religion, 11:45-12:45 Lunch, 12:50-2:05 Chemistry, 2:10-3:25 English.

Anne enters her classroom, sits quietly at her desk, opens her math book to the page the teacher indicates, and sees algebra staring back at her. She begins to read, and the teacher begins to write today's lesson on the blackboard. Anne copies the notes from the board into her notebook as fast as she can. The teacher keeps talking, erases the problem from the board, and begins writing a new equation. The class goes on like this, and Anne continues to copy as much as she can, rushing to squeeze it all into the given time. At one point, the teacher calls on students to solve a difficult question on the board. As usual, Jason and a few others from the 'smart' team volunteer. Anne tries to follow these students but can't understand them: one speaks too fast, another too softly. The teacher assigns homework, and Anne packs her things into her bag and hops along the hallway to her next class.

She follows the expectations to the core: come in, take a seat, and listen quietly. This teacher returns the students' essays on their own religions. As the teacher nears Anne's desk, Anne shifts uncomfortably in her seat, anxious that the teacher will say something about her mark in front of her peers; the teacher hands over her essay quietly and returns to the front of the class. Anne looks down at her marked essay: a D+. As she consoles herself that at least she passed, the teacher starts the day's lecture on the beliefs, customs, and traditions of Judaism and puts notes on the projector screen. Anne again begins copying, squeezing in as much information as she can before the teacher changes to the next sheet. She sits passively for the duration, half listening and half jotting down notes, often turning around to look at the clock and count down the minutes. Finally it's over; time for lunch.

After lunch, similar activities take place in her Chemistry and English classes, with two exceptions. First, she experiments with chemicals for about 30 minutes in her chemistry lab. Anne has always loved her time in the chemistry lab: she doesn't have to sit any more and can experiment with her hands, which always leaves her with energy for the next class. Her English class is as usual: sitting in one place, quietly reading a novel, passage, or poem, and writing about something. Today was all listening, the only difference being that it was her peers' presentations. By 3 o'clock, Anne was exhausted and put in a great effort not to slip into oblivion. It's now 3:25 pm, and the day is over at last.

It's now time to put on the AR headphone and activate the smart-chip!

Anne enters her math classroom, so happy that her principal and teachers have given her permission to use the AR Headphone in her classes. She takes out her AR Headphone, puts it on, and authenticates her identity. She then quietly whispers “Please video record”. Internally, the AR Headphone sends the signal to the Cloud App Engine, which passes it to the Smart Chip; the Smart Chip sends a message back to the Cloud App Engine, which in turn has the AR Headphone voice back “Recording starts now” to Anne, who has already begun closely watching the math teacher's explanation of the algebra equation. She is so happy that she no longer has to write notes quickly. She attentively watches the teacher's writing on the blackboard and even volunteers and goes up to the blackboard, commanding “Please pause recording” and receiving the confirmation “Recording paused”, and solves the equation, as she has fully understood the concept. She then returns to her seat, puts the AR Headphone back on, commands “Resume recording”, and hears the “Recording in progress” message. Anne ends her recording session by commanding “End video recording” and immediately hears the message “Video recording stopped.”

Sitting in her next class, Anne can't wait to get her essay mark. She knows she did well on the essay; with the AR Headphone, there is no way she didn't. She used it at home to get help and write the essay. She opened her religion book, commanded “Read opened page”, and listened to every single word being read back to her. She then dictated “Summarize opened page” and listened to the summary, which further confirmed her understanding of the article. She then commanded “Search internet and define Buddhism” and received the response “According to Lee Chan, Buddhism is ….”, after which she commanded “Open MS Word” and received the voiced reply “Word is now open”. Anne then said “On speech recognition”, and the AR Headphone voiced “Begin dictating”, so Anne simply moved around her room folding and putting her clothes away while dictating the full essay. She dictated punctuation marks as well, simply by naming them: “Huge statues of Buddha can be seen in many cities in China comma, yet very few can be seen in Korea period”, and the output on the page would be “Huge statues of Buddha can be seen in many cities in China, yet very few can be seen in Korea.” She finished her dictation into Word by commanding “Print”, and the AR Headphone voiced back “Printing in progress”. Anne couldn't believe her mark: for the first time, she got an A- on her essay.
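The dictated-punctuation behaviour in Anne's scenario amounts to substituting spoken punctuation words with symbols. The sketch below is a naive illustration; the word-to-symbol mapping and the substring-replacement approach are assumptions for this sketch, not how a real speech recognizer works.

```python
# Minimal sketch of dictated-punctuation substitution, as in Anne's essay.
# The mapping of spoken words to symbols is an illustrative assumption.

PUNCTUATION_WORDS = {
    "comma": ",",
    "period": ".",
    "question mark": "?",
    "exclamation mark": "!",
}

def apply_dictated_punctuation(dictation: str) -> str:
    """Replace spoken punctuation words with symbols, attaching each symbol
    to the preceding word (e.g. 'China comma yet' -> 'China, yet')."""
    text = dictation
    # Replace multi-word names first so "question mark" is not split.
    # Note: this naive substring replacement would also fire inside words
    # like "command"; a real recognizer would tokenize the speech instead.
    for word in sorted(PUNCTUATION_WORDS, key=len, reverse=True):
        text = text.replace(f" {word}", PUNCTUATION_WORDS[word])
    return text
```

Applied to Anne's dictation, "…China comma yet very few can be seen in Korea period" becomes "…China, yet very few can be seen in Korea."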

Throughout the day, the AR Headphone continued to enhance Anne's learning experience. From sitting passively for long periods, half listening to lectures, jotting down notes as fast as she could, and being too tired to absorb most of the information presented, Anne has come a long way. She is now more alert, interested, and engaged in class activities. She is no longer disengaged from learning; in fact, Anne not only absorbs the information, she has a strong grasp of the content, enjoys learning, and has become an active participant in her own learning.

Scenario 2

Ranveer is fully blind. He is traveling from Toronto to Sulsback, Germany, on an emergency. Travelling is a humongous task for him. Ranveer gets help from his family at the airport, including the ride there, baggage handling, and getting his boarding pass from the immigration officer, until he is ready to go through the security gateways. From there onward, he usually gets an escort, whose job is typically to guide him through the doors, gates, and security personnel all the way to the flight door. To get this escort service, he always has to wait anywhere from 15 to 45 minutes. Then he is all alone on the flight; he rarely asks the attendant for help and gets up from his seat only once, to use the washroom, until he gets off the flight, when the escort and family services start again.

This has been Ranveer’s regular travelling experience.  No more of this! 

Today, Ranveer is travelling with his brand-new AR Headphone. As he gets out of the car, he commands “Escort needed to unload luggage from car.” The AR Headphone automatically connects with airport escort personnel, indicates Ranveer's location, and responds to Ranveer: “Escort on the way. Arriving at your location in 2 minutes.”

Ranveer continues, dictating “Guide me to the immigration officer servicing my flight”, and through the AR Headphone he hears, “Go straight through the doors, turn right, and continue walking 2 meters.” Following the AR Headphone's voice-over, he successfully navigates the hall, elevator, and aisle, and arrives at the designated line for his flight. Ranveer dictates “What is the wait time?” and the AR Headphone automatically connects with the immigration officer's terminal and replies, “The estimated wait time is 50 minutes at the current speed at which officers are processing the boarding system.” The AR Headphone enhances Ranveer's travel all the way until he clears the immigration system in Germany. Ranveer turns off his AR Headphone when his uncle reaches him.

With the AR Headphone, Ranveer has navigated the security systems, services, and halls of both the local and international airports completely independently. The system uses a network of strategically selected outlets, in conjunction with the cloud app, to enable fully blind and visually impaired passengers to navigate airport terminals without any escort. The AR Headphone and the cloud app are completely intuitive for blind people. With this device, Ranveer can go from point A to point Z without any problem, since the audio voice-commanding system guides him every step of the way. The AR Headphone, with its preloaded detailed map of the terminal, uses data from the airport partners to triangulate Ranveer's exact location and identifies every feature within a 25-meter range of his physical position.

The AR Headphone works in two ways: actively and passively. For instance, when Ranveer voice-commands “Direct me first to the currency exchange booth and then to the food court”, the prompt is followed. If Ranveer does not wish to activate this service and is wandering around the airport killing time, the AR Headphone will still state, “You are walking past corridor E; now you are passing the washroom”, without any command from Ranveer. Additionally, should Ranveer want information, he will get regular reminder notifications of any flight delays, waiting times for departure, weather conditions, and so on.

The baggage area is a nightmare even for those with great vision, and up until now Ranveer has been no exception. Now, for the first time, he will know which carousel he is standing at and the colour, size, and shape of the suitcases going around and around.

The AR Headphone system isn't limited to the airport and its services. It enables the wearer to navigate any maze-like building or retail center. It also serves as evidence of how technology for blind people is pushing innovations like the AR Headphone to the forefront. Previous tools for blind people were intended to assist with certain activities indoors only. This AR Headphone is more than a simple navigation tool; it is, in fact, a tool of empowerment and independence for the blind.

 

Future Scope

 

-       Lighter versions in the future

-       More user-oriented applications

-       Augmented Reality taken to the next level

-       Partnerships with various outlets

 

Related Resources

http://www.slideshare.net/nidhinpkoshy/google-glass-ppt?next_slideshow=1

 

http://studymafia.org/wp-content/uploads/2015/01/CSE-Google-Glass-report.pdf

 

http://www.cin.ufpe.br/~rar3/uploads/2/0/3/5/20356759/bookchapter_published.pdf

 





 

 


3 Comments

  1. Made comments on the smart chip augmented experience doc, since it was easier to draw attention to individual issues within your points.

    Some general comments on what you've put together so far:

    • Your scope is way too big. You describe people who are blind or partly sighted, deaf (but you omit hearing impaired), mute or speaking impaired, physically disabled (i.e. wheelchair-bound)...it's all too much. Any accessibility device that helps all these groups will either feature technology that serves everybody (which is a zero set), or technology for every group (which is more than you could cover in a single project). 
      • I would recommend that you pare it down somewhat. You already seem to be doing so in your points above, which seem to be targeting blind people. Maybe just stick to a single group, to provide your project with more focus and precision.
    • Many of the things that your device claims to do are currently impossible or unfeasible, given the current state of the art. Individually, object recognition and text recognition and room navigation are really hard problems, but you're suggesting real-time room + object + text recognition together, which is simply beyond what huge computers can do right now, never mind smart chips. Check out some of the things that the TAGlab is working on right now, to give yourself an idea of the current limitations of camera technology.
    • You need to talk to some of the people that you're trying to help. Whenever you're designing for a group that you don't belong to, you need focus groups to help understand what that group's needs are. I work with enough blind people to know that you're inventing problems where there aren't any, and you need to be able to answer confidently if somebody asks you whether these features solve a real problem or not.

    There's also an issue of device overload, but that might be solved by reducing your scope and including the minimal set of devices (ideally one) that will provide the necessary services. Remember that adopting new devices isn't something that people do willingly, both because of the cost issues and the learning curve issues. So there has to be a severe pain point that your device addresses, and it has to address it so well that people will overcome their natural hesitation in order to try it out. Try to flesh out a single idea well, and polish that to a shine, instead of trying to make everything for everybody.

  2. Hi Steve, 

    Thanks a lot for your wonderful feedback.   They have tremendously helped me to narrow down my scope of this project.   I have updated my project and would greatly appreciate any feedback at your convenience so that I can enhance my project for the final.  Again, thanks a lot. 

  3. I agree - the scope of the project concerns me. Not everyone is going to be assisted by the same piece of technology, and that's my main piece of feedback. I think Steve is right, that you're primarily focusing on blindness - this wouldn't really help someone that's paraplegic. How does this complement technologies already available to these individuals? Have you consulted anyone that's blind on what would be most helpful for them?