As companies rush to adopt AI, they discover an unexpected truth: Even the most rational enterprise buyers do not make purely rational decisions; their unconscious requirements go far beyond conventional software evaluation criteria.
Let me share an anecdote: In November 2024, I sat in a New York skyscraper, working with a fashion brand on their first AI assistant. Nora, a 25-year-old digital avatar, is displayed on a six-foot kiosk. She has sleek brown hair, a chic black suit and a charming smile. She waves hello when she recognizes a customer's face, nods as you speak, and answers questions about company history and tech news. I came prepared with a standard technical checklist: response accuracy, conversation latency, facial recognition precision…
But my client barely glanced at the checklist. Instead, they asked: "Why doesn't she have her own personality? I asked about her favorite handbag and she didn't give me an answer!"
It is striking how quickly we forget that these avatars are not human. While many worry that AI is blurring the boundaries between people and machines, I see a more immediate challenge for businesses: a fundamental shift in how the technology is evaluated.
When software begins to look and act human, users stop evaluating it as a tool and start judging it as a person. This phenomenon, anthropomorphism (assessing non-human entities by human standards), has been well studied in human relationships and is now emerging in the relationship between humans and AI.
When it comes to procuring AI products, corporate decisions are not as rational as one might think, because decision-makers are still human. Studies have shown that unconscious perceptions shape most human-to-human interactions, and corporate buyers are no exception.
Therefore, companies signing an AI contract are not just concluding a "utility contract" aimed at cost reduction or revenue growth. They are entering into an implicit "emotional contract." Often, they don't even recognize it themselves.
Although every software product has always had an emotional element, this aspect becomes far more prominent, and more unconscious, when the product closely resembles a real person.
These unconscious reactions shape how your employees and customers engage with AI, and my experience tells me how widespread these responses are: they are deeply human. Consider these four examples and their underlying psychological concepts:
When my client in New York asked about Nora's favorite handbag, longing for her to have a personality, they were invoking social presence theory: treating the AI as a social being that needs to feel present and real.
One customer fixated on her avatar's smile: "The mouth shows too many teeth; it's unsettling." This reaction reflects the uncanny valley effect, where almost-human features cause discomfort.
Conversely, a visually appealing but less functional AI agent may still be favored because of the aesthetic-usability effect: the idea that attractiveness can outweigh performance problems.
Another client, a meticulous business owner, kept delaying the project launch. "We have to make our AI baby perfect," he repeated at every meeting. "It has to be flawless before we can show it to the world." This obsession with creating an idealized AI entity reflects the projection of our ideal self onto our AI creations, as if we were building a digital entity that embodies our highest aspirations and standards.
How can you lead the market by leveraging these hidden emotional contracts, and beat the competitors who are currently piling up one flashy AI solution after another?
The key is to determine what matters for your company's specific needs. Set up a testing process. This not only helps you identify your top priorities but, even more importantly, lets you deprioritize minor details, no matter how emotionally compelling they feel. Because the sector is so new, there are almost no ready-made playbooks. But you can be a first mover by establishing your own method for finding out what suits your company best.
For example, the client's question about "the AI avatar's personality" was validated by testing with internal users. Conversely, most people could not tell the difference between the versions the business owner had agonized over for his "perfect AI baby," which meant we could stop at "good enough."
To recognize such patterns, hire team members or consultants with a background in psychology. None of the four examples above is unique; they are all well-researched psychological effects that occur in human-to-human interactions.
Your relationship with the tech provider must also change. They need to be a partner who navigates the experience with you. You can set up weekly meetings after signing the contract and share your takeaways from testing so they can build better products for you. If you don't have the budget for that, at least build in extra time to compare products and test with users so that these hidden "emotional contracts" can surface.
We are at the dawn of how humans and AI interact. Successful executives will embrace the emotional contract and put processes in place to manage the ambiguity, which will help them win the market.
Joy Liu has led enterprise products at AI startups and cloud and AI initiatives at Microsoft.