Tech and the City Calendar | Tech and the City
Choose your own career hacks, Melbourne.
Tech, data, culture, workshops. https://ellebrooker.wordpress.com/tech-and-the-city-free-events-in-melbourne-australia/
What does a brand sound like? This was the question posed at Deloitte Australia's 'Conversation – Designing the Interface of the Future' event, hosted in conjunction with 'Disruptors in Tech', a Melbourne-based MeetUp group.
I have to admit, quite what a brand sounds like isn’t something I’ve ever really thought about before.
Close your eyes for a moment.
Imagine a world in which every product in the shop is blaring about itself in natural language. Possibly it's even holding a conversation with you: calling you by name, fielding your questions, and answering them based on preferences it has gleaned as context. Those preferences come from the data you verbally offered up; from what your smart devices silently told it (your GPS, purchasing and browsing history); and from your loyalty accounts, your biometrics, and even the chip embedded in something you've already bought and are currently wearing, interfacing via Near Field Communication.
It’s not that far-fetched, but it’s hardly everybody’s idea of Utopia.
I instantly assume that every spokesperson, every brand ambassador and every voiceover talent who ever carved out a career faces being automated away, and that an audio apocalypse of the kind 'Minority Report' foreshadows is on the cards.
On reflection, I fervently hope that the option to switch to classic mode (by which I mean 'return to mute') isn't overlooked, or else that I can default everything to Patrick Stewart or everyone's favourite meerkat, Aleksandr Orlov.
(I’d be pretty keen for it NOT to sound like Microsoft Catherine.)
Currently, I'm midway through a two-part webinar with Microsoft Worldwide online, learning how to structure and design text-based chat bots, so this long-ago lecture (which pre-dates the blog, and is one of its three inspirations) is starting to resonate with me, but in a way that makes the chat bot idea seem a little old-fashioned.
Ha!
In my humble opinion, text bots take up far too much on-screen real estate, especially on mobile. And I say that as someone who loves writing and reading (but not chatting or instant messaging). Since I'm on a roll: for the love of all that is good and worthwhile, if I've just agreed that your site may install a cookie, my dear UX designers, please don't follow that up a split second later with a request to take a survey about my experience of the site.
Blink.
You know that I literally don't have any experience of the site yet, don't you? I mean, you only just installed the cookie a second ago, so it invites the question, complete with raised eyebrow: "what experience, prithee?"
To the UX and CX designers confusing metrics with success, high-pressure tactics with what people want, and bells and whistles with colour and interest; and to those making account closure more than a two-step process, peppered with mildly threatening, condescending "warnings" that imply I don't know what I'm really doing: I do not need yet another reason to switch to a low-footprint lifestyle.
Capiche?
Getting back to the subject at hand….
I think it's self-evident what a Rolex watch would sound like (Roger Federer, obvs). But what does an Australian shiraz sound like? How about the bus stop timetable? And will the bot have a name, or will it be openly robotic, devoid of personality?
This event is one of my favourites.
It's so good that it's taken a few weeks to process and decide how best to present it.
This edition of Disruptors In Tech, held at National Australia Bank's 700 Bourke Street outpost (not to be confused with its dockside 800 Bourke Street headquarters, or its two or three other Bourke Street properties which, although equally imposing, are the utterly wrong address for this particular event), showcases the bank's thought process and design considerations as it prepares to launch a chat bot.
A bank bot?
Hmm.
I’m unfamiliar with any scenario where I might be so caught up that I have to make an urgent bank transaction and my hands won’t be available, but OK. People do strange things when they’re in transit.
Lessons learned:
Lesson 1: designing a conversational user interface (CUI) is more fraught than you think.
Lesson 2: personality is hard to do.
As it happens, creating a bot with a flat personality, or no personality (and no name) is just as complex as its alternative. In having no personality, the bot still has a personality. Just not a very sassy, cool or chatty one.
Compound the problem with an assistant that has to flatly, blandly and consistently cover multiple divisions and myriad product lines of a newly agile, complex business (with the added bonus that the bank is currently investing its time in realising that being complex is not an excuse for being a disreputable corporate citizen), and this makes for an interesting case study.
Have they got it right?
Lesson 3: Spoken word is a different animal to text.
Human conversation is less formal, more shorthand and more incomplete than written dialogue. As a result, a conversational bot's chat shouldn't aim to replicate, or in any way be substantially based on, text.
(The sound of a low-flying sunk cost whooshes by, while the trumpet heralding a serious new overhead plays.
Choose your interface wisely.)
I have to say, this is kind of a bummer for enterprises that jumped on the wrong bandwagon and are being leapfrogged, but it's a boonoonoonus for those who write and script for a living. Because if you installed a chat bot, the kind that pops up in a window asking "can I help you?" (a bit like Clippy used to do), under no circumstances should you use it as the basis for your new conversational interface.
If someone is calling about their home loan, you can’t just text-to-voice your website or ‘play to end,’ as that’s not how people converse.
This is because even the punchiest of chat bot text comes across as stuffy and long-winded.
NAB's bots triage and escalate non-self-service problems to actual humans, on the fly, whether the channel is text or voice, ensuring that you don't have to repeat yourself once you and your compound issue are connected.
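That triage-and-handover behaviour can be sketched roughly like this. To be clear, this is my hypothetical illustration, not NAB's actual implementation; all the names here (Conversation, KNOWN_INTENTS, the canned replies) are invented:

```python
# A toy sketch of channel-agnostic triage: the bot answers what it can,
# and hands anything else to a human along with the transcript so far,
# so the customer never has to repeat themselves.
from dataclasses import dataclass, field

# Invented intents and replies, purely for illustration.
KNOWN_INTENTS = {
    "check balance": "Your balance is available under Accounts.",
    "reset password": "I can send a reset link to your registered email.",
}


@dataclass
class Conversation:
    channel: str                          # "text" or "voice": same logic either way
    turns: list = field(default_factory=list)

    def handle(self, utterance: str) -> str:
        self.turns.append(utterance)
        for intent, reply in KNOWN_INTENTS.items():
            if intent in utterance.lower():
                return reply
        # Anything the bot can't self-serve gets escalated on the fly.
        return self.escalate()

    def escalate(self) -> str:
        # The human agent receives the channel and full transcript as context.
        handover = {"channel": self.channel, "transcript": list(self.turns)}
        # ...queue `handover` for a human agent here...
        return "Connecting you to a specialist who can see our chat so far."


convo = Conversation(channel="voice")
print(convo.handle("I want to check balance"))
print(convo.handle("My home loan is a mess"))  # escalates, context intact
```

The point of the sketch is the handover: the transcript travels with the escalation, whichever channel it arrived on.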
Lesson 4: Testers will be jesters
The bank found that, in testing the feature, some customers could not help but probe the limits of the bot with non-banking questions such as "how tall is a horse?"
The flat answer this bot provides to such fodder is that it cannot assist with the question, but in principle it could one day be programmed to answer all questions put to it, once the basics are covered and up and running beautifully.
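That fallback pattern is simple enough to sketch. The intents and trigger phrases below are invented for illustration, and a real bot would use a proper intent classifier rather than keyword matching:

```python
# Minimal out-of-scope fallback: match known banking intents,
# and give a flat "can't help" answer to everything else.

# Hypothetical intents and phrases, not any bank's real taxonomy.
BANKING_INTENTS = {
    "balance": ["balance", "how much money"],
    "transfer": ["transfer", "send money", "pay"],
}

FALLBACK = "Sorry, I can't assist with that question."


def reply(utterance: str) -> str:
    text = utterance.lower()
    for intent, phrases in BANKING_INTENTS.items():
        if any(p in text for p in phrases):
            return f"Sure, let's talk about your {intent}."
    return FALLBACK  # "How tall is a horse?" lands here


print(reply("Can I transfer $50?"))
print(reply("How tall is a horse?"))
```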
End scene.
“No one wants to participate in daily Turing tests.”- Dennis Mortensen, X.AI
Conversational UI. This means conversational user interface.
In plain English, it means Alexa or Siri: an agent that you interact with verbally. Like these guys are doing:
The Scots in the elevator asking for level eleven
On the last day of winter, I venture in a different direction and into the City of Yarra.
Portable is a digital design and technology company, and an example of what my service design friends at Academy Xi referred to as “agency” (no pronoun required.)
Portable tell me that they're interested in access to justice and various good causes. They have government clients too, which I always find interesting. Today's talk centres on conversational user interfaces (conversational UI for short: voice-activated commands and interactions) and is one of a series of talks that they'll be hosting.
The next one is this Tuesday at 8:30am.
(Register here if you’re interested)
It will nominally be about infrastructure, but based on a typo that I found in the blurb, I’m really rather hoping that it touches on ‘technoclogy‘.
I'm greeted at the door by a cavoodle called 'Pepe', a ragamuffin who proceeds to treat my scarf, coat and fingers as oversize chew toys. Pepe punctuates the meeting with painful half-pleas to his handler to be released immediately to at least look at the table with all of the food on it. (Breakfast is complimentary and consists of all kinds of yummy bagel and patisserie goodies that I wasn't expecting. Much appreciated.)
I'm not sure what law this is, but for brevity's sake let's call it Perton's law: 'that which worked fine in rehearsal will not work at all when required to.'
It affects everyone presenting cutting-edge technology to a live audience ninety-nine percent of the time, and its severity is directly proportional to the importance you're placing on creating a good first impression.
The guest speaker is participating via video link from New York.
His name is Dennis Mortensen. The lag in the feed from New York makes Dennis, who is the Chief Executive of a virtual assistant company, seem as though he may himself be a programmed avatar.
Dennis speaks without moving his lips. He also has a Danish accent of the kind sci-fi films like to give to homicidal robot ladies and the generic neatness of every hipster in the tribe.
This is early morning performance art, and I’m not the only one to notice. The illusion that Dennis is Max Headroom incarnate makes at least one other person do a double take.
Uncanny!
Dennis' company, X.AI, makes an automated meeting scheduler and virtual assistant that comes in two varieties, Andrew and Amy.
These are personal assistants, the kind that mean you won't have to employ an Executive Assistant (which I think underestimates the status symbol that having an EA represents), nor will you personally have to make your own dinner reservations only to be duped into speaking to the restaurant's bot (just like the Google Duplex demonstration demonstrated).
Much of Dennis’ presentation covers agents, which he likens to macros, rather than the conversational bits of the interface, which is what I came here to hear about.
Bots will mutely automate clusters of simple but cumulatively time-consuming tasks, like scheduling, receipts and renewals. A bit like using auto-suggest to fill in an online form, or predictive text in an SMS or mobile document.*
Once developed, you'll be able to install them and run them, and they'll automate things like bookings. Then, at some point, they'll start speaking to one another, bot to bot, and hopefully be able to automate an entire sequence of events, building on your past choices and decisions: an entire movie screenplay, for example, or, less creatively, the flow-on effects of a decision to push back a meeting (which is Dennis' chosen example).
In the future, your virtual assistant bot (or bots) and the bots that your service providers deploy will communicate and arrange everything from your initial travel, car rental, hotel and insurance to extensions, extra insurance and the rescheduling of other appointments, and will auto-suggest places of interest that you might like to visit or restaurants you might like to order from (although you will still need to physically turn up and experience these in real time, with no setbacks along the way, for this to operate at peak efficiency).
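Dennis' "push back a meeting" example boils down to propagating one change through every dependent event. Here's a toy sketch of that cascade, with invented events and no claim to resembling X.AI's internals:

```python
# Toy cascade: delaying one event shifts that event and everything
# scheduled after it by the same amount; earlier events are untouched.
from datetime import datetime, timedelta

# A made-up day, in chronological order.
schedule = [
    {"name": "flight", "start": datetime(2019, 9, 2, 9, 0)},
    {"name": "hotel check-in", "start": datetime(2019, 9, 2, 14, 0)},
    {"name": "dinner", "start": datetime(2019, 9, 2, 19, 0)},
]


def push_back(schedule, event_name, delay):
    """Delay the named event and every event after it by `delay`."""
    shifted, shifting = [], False
    for event in schedule:
        if event["name"] == event_name:
            shifting = True
        start = event["start"] + delay if shifting else event["start"]
        shifted.append({**event, "start": start})
    return shifted


new_schedule = push_back(schedule, "hotel check-in", timedelta(hours=2))
# hotel check-in and dinner both move two hours later; the flight stays put
```

A real agent would negotiate each shift with the other parties' bots rather than just rewriting the diary, but the flow-on logic is the same.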
The day opens with the admission that artificial intelligence (AI) is "hard to define."
This is an emerging theme. For the second or third time in as many weeks, I'm struck by the lack of distinctive words capturing new developments in tech. There's also a reticence to be definitive about concepts that borders on the incomprehensible.
For example: “ground zero.”
Really… that’s the best you can do?
Also "service redesign." How can something so self-evident be so difficult to define?
I've already noticed that this lack of vocabulary is resulting in the regurgitation of the same already-borrowed, ill-fitting words to mean their exact opposites. For example: 'hack'. This can be bad, as in "my account was hacked", or it can be positive, because "hacks" are the outcomes of 'hackathons'.
Dennis refers, without pause or explanation, to "disruption", by which he means an actual old-school disruption, although I might not have called it that, back in the day.
Lately, 'disruption' has been used exclusively in a positive way, when actually, at best, it's ambivalent; traditionally, it's meant a temporary halt to proceedings.
The kind of disruption that Dennis means includes when a website hangs or the AI misunderstands either you, or your intention. (See the Scots in the elevator sketch for an example of this.)
At an event in July about the deployment of chatbots, one of the speakers referred to "raw chicken moments."
(Ghastly, I know, but wait.) A raw chicken moment is extremely relatable. It’s any time when your hands are full and you can’t use them to do things like answer the phone or press play, or speak on the intercom or touch the remote control, because what you’re doing is a bigger priority.
It’s during raw chicken moments that voice commands will come into their own.
I know I would appreciate my phone not ringing or pinging when I'm in the middle of something, i.e. when I'm having a raw chicken moment. There are times when I wish it would snooze or mute, but I'm too busy to take the time to manually make it do this. If I could yell at my phone to go to voicemail, and it would do that, or go to assistant mode, I'd appreciate it. Since I'm already yelling at it (in my head) to shut up and stop interrupting half the time, we're already more than halfway there as far as my adapting is concerned.
One of the big bugbears for agencies (as it is for management consultants) is the time taken to track billable hours. To combat this opportunity cost, Portable are experimenting with Dennis' AI agent.
For the record, I’m unclear how this automated bit of kit that, well, schedules meetings, takes less time to record billable hours than the humans whose hours it’s recording, but I do understand that it’s meant to leave said humans more time to work on things that generate income.
I hazard a guess that the bot includes middleware that evaluates the diary entry fields ('who' the meeting was with, and the duration of said meeting), generating a line item in the bill as a result. I guess that must be how it works…
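My guess at that middleware, sketched out. The field names, the rate table and the whole approach are pure speculation on my part, not how X.AI actually works:

```python
# Speculative sketch: turn a diary entry ('who' the meeting was with,
# plus start and end times) into a billing line item.
from datetime import datetime

# Hypothetical per-client hourly rates.
RATE_PER_HOUR = {"Acme Pty Ltd": 250.0}


def line_item(entry: dict) -> dict:
    """Convert one calendar entry into a line on the invoice."""
    hours = (entry["end"] - entry["start"]).total_seconds() / 3600
    rate = RATE_PER_HOUR.get(entry["who"], 0.0)
    return {
        "client": entry["who"],
        "hours": round(hours, 2),
        "amount": round(hours * rate, 2),
    }


meeting = {
    "who": "Acme Pty Ltd",
    "start": datetime(2019, 9, 3, 10, 0),
    "end": datetime(2019, 9, 3, 11, 30),
}
print(line_item(meeting))  # {'client': 'Acme Pty Ltd', 'hours': 1.5, 'amount': 375.0}
```

If the scheduler is already the system of record for who met whom and for how long, the billing line falls out of the diary for free, which would at least explain the pitch.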
I question whether the value proposition is so niche that, commercially, it's a minimally viable problem to be solving.
In his opening gambit, Mortensen relates a familiar tale. All you want to do when you land somewhere is find out where the food is, whether there's a pool or laundry, and orient yourself.
Instead you wind up in the hotel room having to log in to the wifi, and then download an app and well…
I like that Dennis is against businesses creating in-house apps, especially when a website would do just as well (looking at you, Booking.com, Airbnb and Culture Trip).
I like that he is against the time, data and phone storage waste that apps represent to a user.
As we established earlier in the series, I'm a thin-client, low-code, limited-app kinda gal. This is because storage was at a premium during my recent trip overseas, due to the age of the phones I took with me. I have no time for pig-path apps, with their bloated code, and the impact they have.
But.
Dennis' solution is to install an Alexa in every room, so that as we unpack we can ask questions and multitask, which is great. But he is also forgetting that hotels have real live concierges, porters and sometimes butlers (thank you, Raffles), as well as direct-dial in-house phones with speaker functions: things that guests are already used to using and expect to have access to.
So, he’s missing a trick a bit with that suggestion.
Do we need to know whether the thing on the other end of the line is human or not?
Dennis thinks disclosure is important, but if you’re in any doubt the test to apply is “what can I win and what can I lose?”
This is a test that any reasonable human being interested in doing a good job ought to be asking All. The. Time.
He adds that business is all about trust. People find it weird to be subjected to daily Turing tests, so don't be weird.
Are we moving towards a post human future?
No.
*Dennis briefly touches on motor vehicle AI and how it "disappears." I'm not sure I grasp what he means, but I will say that auto-drive presents its own unique challenges vis-à-vis hacking and desirability. If people liked safe driving, we wouldn't have car culture.
People like status symbols and risk-taking. To be robust, the business case / MVP / use case has to factor human factors in.