Alexandra Levitt is a Workplace Futurist, Bestselling Author, WSJ Columnist, and CEO of Deep Talent.
Below we’ve compiled the best excerpts from her earlier appearance on The DEX Show. Levitt’s book, Deep Talent, is in stores now.
What’s the pulse on AI today?
>>>There’s still a lot of hype out there.
AI is still very much in the hype cycle, and there's a lot of anxiety among people who feel like we needed to do Gen AI pilots yesterday and we need to be incorporating AI into every piece of operations within our business! And the truth is, we're lacking very clear use cases where AI solves a particular problem.
A couple of years ago, we saw a similar phenomenon with the metaverse, where everybody was talking about it. Generally speaking it was more of a philosophical question, and no one was being ultra-specific about how they were going to get from point A to point B.
What’s being overlooked?
>>>Gen AI and Large Language Models aren’t the huge leap people say they are.
Generative AI and Large Language Models aren't necessarily the huge leap that people think they are. Really, it's just a logical progression of how much smarter technologies have been getting over the last couple of decades. It's not until we have something like artificial general intelligence, where these algorithms can start making independent decisions and changing things without input from a person or a data set, that I think we're really going to see a transformation within society.
How should organizations implement AI innovations?
>>>Slow and steady wins the race, start with small pilots, and experiment endlessly.
Just talking about it (AI) isn't going to transform your business. So there are two things:
- You want to look for ways to make it very tangible, which will, in my opinion, involve small pilots throughout the organization and a lot of experimentation; and
- You also don't want to get too worried about being behind, because I think this is the kind of thing it's better to do slowly, sensibly, and right than to just kind of jump on the bandwagon.
What advice do you have for IT job seekers in the future?
>>>Double down on the oft-ignored “human skills” in IT.
It was about 10 years ago that I started talking about tech workers and how things were going to shift fundamentally in the technology world. When you have a population that has been highly employable, highly marketable, and highly sought after for decades… it became pretty obvious to me early on that at some point they were going to be the first ones to be automated, because those are some of the tasks that are most natural to give over to machines.
And I worried a little bit about tech workers because I thought: are they not keeping their human skills up to par while they're resting on their hard technical skills? And when I say human skills, I'm referring to things like intuition, empathy, judgment, problem solving, creativity. And these are things that a lot of IT folks haven't had to develop.
It's not that they don't have them or they're not interested in having them. It's just that that hasn't been the focus.
How can they (IT) add value to these roles in the realm of oversight? Whenever you insert a machine into a traditionally human-driven process, you still need a person to design it, to oversee it, to fix it when it's broken, to figure out how to redeploy it, and then to communicate its insights to business leaders.
Those are a lot of jobs that IT people who were previously just programmers could take over, stepping up into the light to fill those roles. So that's the long answer. The short answer is: the roles are evolving to be much more well-rounded.
To improve the workplace, what needs to happen between employees, employers, and IT?
>>>More IT “stewards”; general AI knowledge for employees; and employers taking more responsibility for their culture.
Employers
I constantly see this tension between employers and employees regarding whose responsibility it is to make sure you have the right learning for your job. Employers would like to say it's the employee's responsibility, especially with the democratization of a lot of content.
So companies would like to have employees take more of the onus on themselves. And employees feel the opposite. They're like, if I'm working for your company, you should be telling me what I need to do in order to upskill and do my job more effectively.
And so if you're a company, you're the one whose reputation is on the line. So I think that taking the responsibility for making sure your employees are learning what they need to learn and doing things the right way is very important.
You've got to build culture from the inside out. You can't use an AI to improve culture. You've got to get it right by asking employees and really taking the step to co-create new norms in this post-pandemic era.
Employees
First of all, you have the general population that needs to develop greater IT skills, because all of a sudden you have regular individual contributors from a variety of functions being empowered by AI-based technologies to make decisions that would previously have had to be routed all the way through the organization. Now you have people who can work with technology to do their jobs, and in some cases go beyond their jobs, more efficiently and effectively. And I actually call those applied technology skills.
IT
For tech workers, I think that steward role is a massive need, one that will save not only a lot of tech workers' jobs but also a lot of organizational reputations.
Because what I see happening is that these technologies are going to be deployed. They're not going to have proper oversight. There's going to be a big boo-boo that's made. And all of a sudden, everyone will be pointing fingers at each other.
But really, if IT stewards had been there in a proper oversight role, then maybe that wouldn't have happened.