Generative AI: developing human potential
Human by Design: How AI unleashes the next level of human potential - Erin Kinnee
Erin works at Accenture, finding more innovative ways of working and helping people work smarter and more efficiently. Technology Vision 2024 is Accenture's annual view of the trends that are coming; ten years ago it was predicting that everything would move to the cloud. It is produced by a lot of clever people taking in huge amounts of information, interviewing leaders and practitioners, and distilling all of that into the trends.
This theme of Human by Design - how AI unleashes the next level of human potential - has been talked about for years. For decades we have had to make technology work for us rather than the other way around. A projector and screen can convey information, but it is not really humanist or intuitive in its design. Have you ever dropped your mobile on your face? Phones are great and give access to a world of information, but they are still unhuman in their design: our hands are not square, and people fall over when walking and texting at the same time. Laptops are wonderful things for accessing a world of information, but without the right setup you can end up with eye strain or repetitive strain injury.
ChatGPT came out eighteen months ago and everyone was using it; people knew how to use it to get answers in a language they understood and to get information quickly. This brought a new wave of human-centred design thinking to those creating the Technology Vision: if we design things that are intuitive, easy to use, and that think and work like us, then people will be more likely to use them.
Four trends cover reshaping our relationship with knowledge, ecosystems for AI, creating value in new realities, and a new human interface. A Match Made in AI is about how we are changing our relationship with data, how we access it, and the way it works: not just giving us answers, but processing information like a human would. Meet My Agent is about ecosystems of AI agents that take actions and command major aspects of a business, letting us take a further step back and have the technology do more of the work for us. Spatial computing is growing, and enterprises will need to find the killer apps. Finally, there is a new human interface for interacting with computers in more human-centric ways.
A Match Made in AI
This is a change from searching to answering. Searching has been the model with Google and Bing, but with ChatGPT we are asking for answers rather than searching for a list of results to trawl through. If you search for something on Google, the first result will be an advert or a sponsored result, and we will come to a point where Google is not how we look for information because we want to be given answers instead. We need to think about how to structure information so that it can be surfaced correctly in the future (one possible approach is sketched below). The way we take in information and the way we access it is commonplace: we know how to use a website or an app without being given instructions because they are intuitive, but with AI the interface could be anything. There are devices like the Rabbit r1 where you can speak to it and get it to do things, getting what you need just by asking, such as asking it to play something from Spotify.
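One way to make information answer-ready is to publish it with structured, machine-readable markup alongside the prose. The snippet below is a minimal sketch of that idea using schema.org-style FAQ data; the question and answer are invented placeholders, not something from the talk or from Accenture.

```python
import json

# Minimal sketch: describe a question and its answer as schema.org FAQPage
# markup so that crawlers and answer engines get a machine-readable version
# of the content rather than free text to guess at. The question and answer
# below are illustrative placeholders only.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is Human by Design about?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Designing technology that is intuitive and works the way people do.",
            },
        }
    ],
}

# Embed the output in the page, e.g. inside a <script type="application/ld+json"> block.
print(json.dumps(faq_markup, indent=2))
```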
A New York lawyer misused ChatGPT. We know that LLMs can hallucinate, and a lawyer in New York used ChatGPT to write a brief that cited cases and presented it to a judge, but it turned out the cases did not exist, although those mentioned in them did; the judge was not best pleased about this. When doing something trivial with ChatGPT you don't have to worry about it getting something wrong, but in this case the lawyer had to apologise to everyone involved and there were serious repercussions. Writer uses a knowledge graph to fact-check AI-generated content: it will create the content, tell you which parts need to be checked, and show where it got the information from.
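As a rough illustration of that pattern, the sketch below checks extracted claims against a tiny knowledge graph and reports which ones are supported, contradicted, or need human review. The graph contents, the Claim structure, and the verify helper are hypothetical; this is not Writer's actual implementation.

```python
from dataclasses import dataclass

# Toy knowledge graph: (subject, relation) -> (object, source). A real system
# would use a curated graph with provenance recorded for every fact.
KNOWLEDGE_GRAPH = {
    ("ChatGPT", "released_by"): ("OpenAI", "https://example.com/source-1"),
}

@dataclass
class Claim:
    subject: str
    relation: str
    obj: str

def verify(claim: Claim):
    """Return a status plus the source the check was made against."""
    fact = KNOWLEDGE_GRAPH.get((claim.subject, claim.relation))
    if fact is None:
        return "needs review", None      # nothing in the graph to check against
    obj, source = fact
    if obj == claim.obj:
        return "supported", source       # surface where the information came from
    return "contradicted", source

# In practice claims would be extracted from the generated draft by a model;
# here they are hard-coded to keep the sketch self-contained.
claims = [
    Claim("ChatGPT", "released_by", "OpenAI"),
    Claim("ChatGPT", "released_by", "Accenture"),
    Claim("Technology Vision 2024", "published_by", "Accenture"),
]

for claim in claims:
    status, source = verify(claim)
    print(f"{claim.subject} {claim.relation} {claim.obj}: {status} ({source})")
```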
Meet My Agent
AI is starting to take action. Enterprise AI has so far been about taking back-office processes and making it easier for people to find information, but it still requires a person to act on the information provided. There will be new roles to police the actions performed by an AI. One example was a company called DoNotPay, which finds subscriptions you aren't using but wouldn't previously do anything about them for you; the CEO asked it to find him money by spotting fees for things he wasn't using or was being overcharged for, generating letters to negotiate rates down, and contesting parking fines on his behalf, so it took care of a lot of the admin and cognitive load. Someone said they wanted AI to free up their time to be more creative, not to free up their time to do the dishes; they wanted the AI to do the dishes instead. There is an appetite for AI to do more, but there has to be a balance so that it does not take away the good things in life. Google, Stanford and Princeton taught AI to make tools to solve problems: they set up two AI models, one to recognise a class of problems and create a tool for it, and the other to execute the solution using that tool, with one being a smaller model and the other more capable (a rough sketch of this split follows below).
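Here is a rough sketch of that maker/user split, assuming an OpenAI-style chat completions API; the model names, prompts, and the scheduling task are illustrative, and none of the paper's actual prompting, caching, or verification steps are reproduced.

```python
# Sketch only: one more capable model writes a reusable tool, a smaller model
# then applies it to individual task instances. Model names are assumptions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def make_tool(task_description: str) -> str:
    """The more capable model writes a reusable Python function for a class of tasks."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": f"Write a Python function solve(task) for tasks like: {task_description}. "
                       "Return only the code.",
        }],
    )
    return response.choices[0].message.content

def use_tool(tool_code: str, task: str) -> str:
    """The smaller, cheaper model works out how to call the tool for a specific instance."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": f"Given this tool:\n{tool_code}\nShow the call to solve() for: {task}",
        }],
    )
    return response.choices[0].message.content

tool = make_tool("scheduling a meeting time that works for everyone from a list of availabilities")
print(use_tool(tool, "Alice is free Mon/Tue, Bob is free Tue/Wed"))
```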
The Space We Need
Creating value in new realities. This trend tackles the question of the metaverse, which included Facebook changing its name to Meta and the idea of living in the metaverse with headsets; that put it in the line of sight of a lot of corporations, but we are not living in that world yet. The underlying technology of spatial computing has been around for years, and we have been looking at how to bridge the physical/digital divide. How can you use VR without it looking gimmicky or forced? In the last five years the market has changed, headsets have become more comfortable, and seeing that evolution has been fascinating. The metaverse and spatial computing have leapfrogged ahead in their usability for businesses: it used to need specific skill sets to light and build things in 3D, but now it is a lot easier to create these environments, and you can scan an image and create a 3D model to be placed into a virtual environment. A problem with the metaverse was that it was quite difficult to leave one world and join another and keep that continuity, but now there is a lot of standardisation in how we experience those environments. People have been waiting to get on board, but if you wait for that saturation point it will be too late, and generative AI can help bridge the physical/digital divide.
Our Bodies Electronic
A new human interface. The next generation of technology aims to better understand humans, with brain-computer interfaces like Neuralink or even eye tracking, and it can now do more consumer-type things and do them more quickly. We live in a world where people are passionate about how their data is used; for some companies, people want privacy policies to spell out how brain-wave information is safeguarded. How can businesses use this, when they have to rely on users giving up more information in order to get a better experience? If you run a product-based business, once you hand over a product you won't know how people are reacting to it unless they tell you, because you don't often know what is going on in a person's head; technology like this could indicate how people are reacting to products. Examples of the impact of these devices include a brain-computer interface headset used to control a robot dog with 97% accuracy using brain waves; brain-computer interfaces used to demonstrate the delay between the brain seeing and understanding something and our motor skills acting on that information, exploiting the brain's pattern-recognition ability when looking at X-rays so the signal from the brain could be used to determine and correctly label what was seen when recognition was triggered; and enhancements that move prosthetic limbs more seamlessly by using the intent of the person to create a smoother experience.