Sarah Luger, PhD, sat down with host Alex Olesen, VP of Vertical Strategy & Product Marketing at Persado, to share some key ways humans and AI work together and could collaborate in the future. This partnership between humans and AI offers an uplifting perspective: AI automates mundane tasks at scale, freeing humans to take on more complex work. But what does this mean in the short term and in the long term? According to Sarah, right now ChatGPT and GPT-based technologies are being used heavily in marketing, writing, and creativity-based tasks. Generative AI is great for writer’s block, and people can easily figure out how to engineer prompts thanks to decades of using Google. At the enterprise level, the relationship between AI and humans is based on trust: not just humans trusting AI to complete tasks quickly at scale, but also using AI to add value to the customer experience. Forward-thinking companies use AI to learn customers’ preferences and to provide more personalized experiences and recommendations.
“At Orange, we take our customer engagement extremely seriously. We are customer-centric. Our concern is a great customer experience and we know that we’re sitting on a gold mine of data. Now we have a tool that allows us to personalize and create even better experiences,” said Sarah.
Sarah hopes to see in the future how large language models will transform economic and global engagement, especially when it comes to underserved languages and communities.
Episode Transcript:
1
00:00:01.150 –> 00:00:05.030
[email protected]: All right. We are now recording. So I’m going to get started in a couple of seconds.
2
00:00:08.500 –> 00:00:09.570
[email protected]: All right. Here we go.
3
00:00:10.500 –> 00:00:19.100
[email protected]: Welcome back to Motivation AI Matters. Today I am really excited to be joined by Sarah Luger.
5
00:00:33.850 –> 00:00:36.530
[email protected]: Sarah, thank you for joining us today.
6
00:00:38.190 –> 00:00:53.090
sarah: Thank you so much, Alex. It’s a real pleasure, as you know. I’ve been working in the space for some time, but I’m also excited about Persado’s role in this space. So thank you again for having me. My name is Sarah Luger.
7
00:00:53.210 –> 00:01:00.069
I got my PhD at the University of Edinburgh many years ago in artificial intelligence, and I would be
8
00:01:00.480 –> 00:01:18.790
sarah: I’m being quite honest to say that I’m surprised and excited about the developments in this space. In the past I’ve worked at startups. I’ve worked at Ibm building a precursor to the Ibm. Watson jeopardy challenge robot.
9
00:01:18.820 –> 00:01:28.430
sarah: I’ve also been at Orange Silicon Valley for 5 years. We’ve worked on numerous topics, including voice by metrics. Chat bots call center technology.
10
00:01:28.480 –> 00:01:30.540
and of course,
11
00:01:31.100 –> 00:01:35.130
sarah: everything that has to do with large language models and generative AI.
12
00:01:35.320 –> 00:01:39.469
I’m really excited, especially because, as you may or may not know.
13
00:01:39.690 –> 00:01:56.930
sarah: Orange is a large company with almost 140,000 employees, and we’re in 27 countries. Many of those are in North and West Africa, and many of our customers speak languages that are low-resource, which means they don’t have a lot of training data.
14
00:01:57.030 –> 00:02:12.949
sarah: They don’t have an online presence that supports the kind of data and that’s used in conventional translation systems to create high quality translations. So I’ve been working a lot in that space. And I do see both potential
15
00:02:13.080 –> 00:02:20.550
sarah: for large language models to support our customers, but also some peril. And I’m looking forward to chatting more about that today. Thank you.
16
00:02:20.720 –> 00:02:26.429
[email protected]: I think that’s fantastic. And you’ve got a great background. I know you’ve
17
00:02:26.580 –> 00:02:38.099
[email protected]: you’ve seen this industry evolve through through numerous iterations. It’s most recent what is now being referred to as generative. AI.
18
00:02:38.370 –> 00:02:50.530
[email protected]: I know you just touched on a lot of really interesting topics we’ll dive into over the course of this episode. But to give the listeners a good baseline,
19
00:02:50.730 –> 00:03:03.009
[email protected]: in your own words, could you define what this new term generative AI means, and then talk to me a little bit about who generative AI is important for?
20
00:03:04.990 –> 00:03:06.020
sarah: Okay.
21
00:03:06.390 –> 00:03:09.710
sarah: So I think
22
00:03:09.820 –> 00:03:15.070
sarah: for the average person out there, generative AI
23
00:03:15.360 –> 00:03:25.520
sarah: is AI where the output resembles human content. It resembles language that either
24
00:03:25.540 –> 00:03:28.680
that seems like it’s constructed by a human
25
00:03:29.000 –> 00:03:54.640
sarah: or, technically: generative AI systems are based on algorithms that learn from a vast amount of input data. In the most recent cases that we’ll dig into, that would be all of the digital data that’s on the web, as well as some knowledge bases, knowledge bases being things like Wikipedia, that give structure and associate terms and
26
00:03:54.700 –> 00:04:02.799
sarah: some apparent meaning to this vast sea of language data. And so
27
00:04:02.820 –> 00:04:05.690
sarah: what’s going on under the hood is that there is
28
00:04:05.840 –> 00:04:14.769
sarah: this vast amount of data is being used to learn the patterns of how we as humans speak
29
00:04:14.950 –> 00:04:26.620
sarah: and how we write. And there have been innovations, both from Google’s 2017 transformer paper and incredible compute innovations,
30
00:04:26.700 –> 00:04:31.960
as well as just ongoing neural network developments.
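To make the “learning patterns of words” idea concrete, here is a deliberately tiny sketch in Python. It is a toy bigram model, an illustrative stand-in rather than anything discussed in the episode: real generative AI replaces these word counts with a transformer network trained on web-scale data, but the loop of producing text by repeatedly predicting a plausible next word is the same basic idea.

```python
import random
from collections import Counter, defaultdict

# Toy "pattern learning": count which word tends to follow which.
corpus = (
    "the customer asked a question and the agent answered the question "
    "then the customer asked the agent another question"
).split()

next_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_counts[prev][nxt] += 1

def generate(start: str, length: int = 8) -> str:
    """Generate text by sampling a likely next word at each step."""
    words = [start]
    for _ in range(length):
        counts = next_counts[words[-1]]
        if not counts:  # no observed continuation for this word
            break
        choices, weights = zip(*counts.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("the"))
```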
31
00:04:33.070 –> 00:04:41.310
sarah: There’s the possibility, as many of us have have now tried since November thirtieth, 2,022, when Chat Gpt was launched
32
00:04:41.340 –> 00:05:01.849
sarah: to engage with a generative AI system in a way that most people had not engaged with an AI system before. You know, perhaps in the past you had AI as secondary characters in a video game, or there’d maybe been some predictive analytics in an enterprise
33
00:05:02.070 –> 00:05:15.019
sarah: application you were using. But the core of generative AI is using these patterns of words at a vast scale that then, for us, makes it seem like
34
00:05:15.220 –> 00:05:17.369
sarah: this computer is
35
00:05:17.590 –> 00:05:25.859
sarah: it? Almost? It’s almost a a human like content that’s being output. And it’s really a powerful difference between
36
00:05:25.870 –> 00:05:36.760
sarah: systems from even, gosh, even six or seven months ago. We’ve had a sea change. And your second question is, who is it
37
00:05:37.180 –> 00:05:39.400
sarah: most important for?
38
00:05:39.790 –> 00:05:43.560
sarah: Who is it most important for?
39
00:05:44.540 –> 00:05:49.109
sarah: Well, right now,
40
00:05:49.190 –> 00:05:57.200
sarah: we’re in. We’re in the the hype cycle, and it’s and it’s a little bit of it’s important for everyone.
41
00:05:57.240 –> 00:05:59.539
“this is great for everything.”
42
00:05:59.570 –> 00:06:11.029
sarah: And I respect the hype, as someone who’s in Silicon Valley, because I understand the role that it plays, and the duality
43
00:06:11.410 –> 00:06:22.590
sarah: of how we get investment, and how we build products, and how we have to compete with other hype cycles, be they the metaverse most recently, or blockchain.
44
00:06:22.670 –> 00:06:26.100
sarah: But I think that this is really important for
45
00:06:26.200 –> 00:06:29.799
sarah: creating customer-centric tools
46
00:06:29.970 –> 00:06:32.610
sarah: that support
47
00:06:32.830 –> 00:06:46.540
sarah: voice bots, textual, you know, marketing text. I think that marketing and customer support are the first areas that are going to see innovations in these really
48
00:06:46.810 –> 00:06:49.360
sarah: human-seeming
49
00:06:49.390 –> 00:06:54.050
sarah: engagements that can be created for their customers. So
50
00:06:54.560 –> 00:07:02.539
those are the two areas I see it affecting most. But then I want to also flip that and say:
51
00:07:02.620 –> 00:07:06.619
sarah: I think the holy grail of enterprise
52
00:07:06.790 –> 00:07:15.259
sarah: Innovation that isn’t as shiny and sparkly as some of the other. you know. Key terms I’ve just mentioned
53
00:07:15.280 –> 00:07:19.839
sarah: is enterprise intranet search,
54
00:07:20.020 –> 00:07:36.159
sarah: so, the ability to search through a company’s resources to answer questions for employees, or to answer questions for employees that are then passed on to customers. I think that that is
55
00:07:36.430 –> 00:07:45.519
sarah: really key, because it will help you and me do our jobs better and reduce
56
00:07:45.820 –> 00:07:47.410
mundane tasks.
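The enterprise intranet search Sarah points to is typically built as retrieval over embedded documents: represent each internal document as a vector, embed the employee’s question the same way, and return the closest match. Below is a minimal sketch; the embed() function is a deliberately crude bag-of-words stand-in and the sample documents are invented, while a production system would use a trained embedding model and hand the retrieved passage to a language model to phrase the answer.

```python
import math

# Invented sample "intranet" documents, for illustration only.
DOCS = [
    "Expense reports are due on the fifth business day of each month.",
    "VPN access requires a ticket with the IT service desk.",
    "Customer refunds over 500 dollars need manager approval.",
]

def embed(text: str) -> dict:
    """Crude stand-in for an embedding model: a bag-of-words vector."""
    vec = {}
    for word in text.lower().split():
        word = word.strip(".,?!")
        vec[word] = vec.get(word, 0.0) + 1.0
    return vec

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[w] * b.get(w, 0.0) for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(
        sum(v * v for v in b.values())
    )
    return dot / norm if norm else 0.0

def search(question: str) -> str:
    """Return the internal document closest to the employee's question."""
    q = embed(question)
    return max(DOCS, key=lambda doc: cosine(q, embed(doc)))

print(search("When are expense reports due?"))
```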
57
00:07:48.060 –> 00:07:55.780
sarah: It’s something that AI in general is, you know, aims to do any technology. We try to elevate our
58
00:07:55.790 –> 00:07:58.139
sarah: work tasks up
59
00:07:58.220 –> 00:08:10.129
the difficulty chain. So we want, as humans, to not do the same thing every day, but to understand patterns
60
00:08:10.180 –> 00:08:13.969
sarah: and reduce repetition
61
00:08:14.020 –> 00:08:29.960
sarah: and do more and more challenging tasks. And those more challenging tasks are very hard for computers, so don’t fret, many of us will still have jobs. On the other hand, those lower-level tasks are
62
00:08:30.000 –> 00:08:39.289
sarah: really great opportunities for computers, for generative AI systems, to come in and support us.
63
00:08:41.820 –> 00:08:51.909
[email protected]: Yeah, you you’ve raised, you raise some really interesting points, some of which, you know we’ve discussed in in prior conversations. But I’ll I’ll summarize those quickly for the listeners.
64
00:08:52.080 –> 00:09:08.989
[email protected]: You know we we’ve we’ve talked about. And and I I love the way that you put it. You know, some hype cycles around the metaverse around blockchain. I think we’re we’re definitely seeing a surge of a hype cycle around generative AI,
65
00:09:09.050 –> 00:09:19.740
[email protected]: You know, to another one of your points, the barrier to entry for consuming this type of technology has, in my observation, been lowered
66
00:09:19.860 –> 00:09:23.610
[email protected]: as of November of last year.
67
00:09:23.630 –> 00:09:26.399
[email protected]: you know, with ChatGPT
68
00:09:26.660 –> 00:09:36.730
[email protected]: so readily available for the average consumer, you know, broaching the prosumer market,
69
00:09:36.860 –> 00:09:37.900
[email protected]: And
70
00:09:37.910 –> 00:09:44.730
[email protected]: you know, as you’re alluding to breaking into the enterprise. Space as well.
71
00:09:44.910 –> 00:09:47.030
[email protected]: I think, is going to have
72
00:09:47.100 –> 00:09:59.210
[email protected]: widespread benefit, both in terms of the way consumers interact with this type of technology, but also the way large enterprises similar to Orange
73
00:09:59.340 –> 00:10:12.729
[email protected]: derive value from this type of technology. But I do want to touch on another point that you’ve made, which is unique among the conversations that I’ve had in the last couple of months,
74
00:10:12.840 –> 00:10:19.369
[email protected]: which is the implications around education and career development.
75
00:10:19.430 –> 00:10:27.680
[email protected]: I love what you said, because it goes against the grain, in my opinion,
76
00:10:27.870 –> 00:10:32.050
[email protected]: of what we have seen in the news around job displacement
77
00:10:32.230 –> 00:10:50.360
[email protected]: and the automation of tasks. I think you delineate very well how artificial intelligence will help eliminate some mundane tasks. But you put a great perspective on the state of the market by saying it will free
78
00:10:50.380 –> 00:10:59.559
[email protected]: humans up to do more challenging tasks, and I find that to be a very pragmatic and uplifting perspective.
79
00:11:00.400 –> 00:11:04.460
al[email protected]: if you could expand on that a bit, Sarah.
80
00:11:05.040 –> 00:11:06.130
[email protected]: What
81
00:11:06.240 –> 00:11:09.140
[email protected]: would you say
82
00:11:09.430 –> 00:11:16.780
[email protected]: some interesting applications of humans and AI working together
83
00:11:16.920 –> 00:11:21.819
[email protected]: might be that we haven’t necessarily considered yet.
84
00:11:24.690 –> 00:11:25.560
sarah: Great.
85
00:11:26.330 –> 00:11:30.300
Yeah. And I appreciate
86
00:11:30.720 –> 00:11:42.030
sarah: your perspective as well, because I think that we’re at a point where, because of the hype, there are some
87
00:11:42.930 –> 00:11:48.020
sarah: main primary narratives.
88
00:11:48.380 –> 00:11:57.640
They seem to be extremes: they’re either fear-based or wildly outlandish. And you know the reality is somewhere in between.
89
00:11:57.730 –> 00:11:59.160
sarah: I’ve heard that
90
00:11:59.230 –> 00:12:03.270
with new innovations we grossly
91
00:12:03.350 –> 00:12:10.729
sarah: overestimate the near-term change, and then underestimate the long-term change,
92
00:12:11.010 –> 00:12:27.699
sarah: and that’s not my quote, but I I strongly agree so to your point. What are, what are the ways that this hasn’t been using? And it hasn’t been clear and and with collaboration between humans and
93
00:12:27.790 –> 00:12:30.179
these tools? What will be
94
00:12:30.550 –> 00:12:39.399
sarah: more, what will become clear in the near term? So, what is happening now, what do we see now?
95
00:12:39.910 –> 00:12:58.110
sarah: Right now, ChatGPT and GPT-based technologies are being used in a lot of marketing, writing, and text and ideation and creativity-based tasks.
96
00:12:58.110 –> 00:13:21.119
sarah: So they’re really good for this blank page problem. Maybe call it writers. Block call it Also, from a prompt perspective. You know, we’ve all grown up to some extent feeling comfortable with Google searches. And so we we can figure out how to engineer prompts that give us an output that we’re looking for.
97
00:13:21.260 –> 00:13:32.460
But there’s a long-standing history of humans and computers working together. And in artificial intelligence we’ve developed agents,
98
00:13:32.640 –> 00:13:39.060
sarah: which are little software
99
00:13:39.140 –> 00:13:57.239
sarah: products that help humans complete tasks. And there’s been research in this multimodal space, you know, with sensors; there’s a lot of use in industry. But I think that the next wave of
100
00:13:57.590 –> 00:14:02.659
sarah: this technology will be using the
101
00:14:02.720 –> 00:14:15.749
sarah: trust which a lot of users have gained from this first iteration. And, you know, later I want to maybe push back on that trust. But if we think of this as kind of a consumer life cycle,
102
00:14:15.910 –> 00:14:24.009
sarah: using the trust that the consumer has from the output of their large language model
103
00:14:24.030 –> 00:14:26.460
sarah: chat interface
104
00:14:26.520 –> 00:14:30.389
sarah: and turning that into a resource
105
00:14:31.230 –> 00:14:44.590
sarah: that maybe takes the place of a Google search. Right? So I think that chat interfaces are going to transform the way that humans work
106
00:14:44.690 –> 00:14:45.540
sarah: with
107
00:14:46.460 –> 00:15:01.670
sarah: computers, in the fact that we may no longer have Google PageRank be as important, in terms of Google being the font of all knowledge. Maybe that verb itself, “I’m gonna Google it,”
108
00:15:01.780 –> 00:15:11.079
sarah: will cease to be as relevant. I think we’re going to have a split in our society where people who have become very
109
00:15:11.090 –> 00:15:23.469
sarah: adept at using chat technology may come to use this as their primary resource for new information, for information discovery.
110
00:15:23.760 –> 00:15:35.740
I think that could have profound implications for enterprise search and our ecosystems. You know, if we think of trying to make a decision about, say,
111
00:15:35.970 –> 00:15:43.760
sarah: Siri or Google Home or the Apple HomePod, what is the best smart device?
112
00:15:43.840 –> 00:15:46.620
sarah: it’s hard to evaluate, which is
113
00:15:46.670 –> 00:15:51.900
sarah: in a you know, just a discrete test.
114
00:15:52.260 –> 00:16:03.200
sarah: But maybe, as many consumers do, people make decisions based on their ecosystem. So I imagine in the future that
115
00:16:03.290 –> 00:16:06.279
sarah: OpenAI will leverage this
116
00:16:06.660 –> 00:16:16.670
sarah: collaboration with Microsoft to really pivot the enterprise solutions that the average person spends a lot of their time on.
117
00:16:16.830 –> 00:16:18.010
And by
118
00:16:18.240 –> 00:16:19.980
sarah: changing that habit:
119
00:16:20.620 –> 00:16:25.100
sarah: Anytime you use an ecosystem, you’re providing them with a lot of data.
120
00:16:25.200 –> 00:16:38.960
sarah: And when you provide an ecosystem with a lot of data, it allows them to personalize and learn your preferences. And this could change the way that we, online,
121
00:16:39.440 –> 00:16:45.780
sarah: ask for and receive new information, answer questions. And
122
00:16:45.860 –> 00:17:00.449
sarah: that pivot could reshuffle the companies that we get many of our goods and services from, or at least our initial point of contact for those goods and services. Because I do think that
123
00:17:00.540 –> 00:17:07.520
sarah: especially younger consumers, will feel more comfortable using these tools
124
00:17:08.140 –> 00:17:09.640
sarah: as
125
00:17:09.670 –> 00:17:13.109
sarah: as a co-author, as a co-pilot.
126
00:17:15.890 –> 00:17:28.250
[email protected]: that that’s really interesting. And and, Sarah, you’re you’re touching on a a topic that actually comes up with a lot of clients that I speak to. You know, we’re in an interesting
127
00:17:28.680 –> 00:17:40.300
[email protected]: environment from a macro perspective. At the moment there have been macroeconomic headwinds that many,
128
00:17:40.650 –> 00:17:53.389
[email protected]: you know, large, mid-market, and startup businesses have to navigate, in terms of personnel decisions and R&D investment decisions;
129
00:17:53.660 –> 00:17:57.960
[email protected]: specifically within tech, there have been a lot of reductions in force.
130
00:17:58.480 –> 00:18:02.189
[email protected]: So let me just paint that as the
131
00:18:02.480 –> 00:18:09.760
[email protected]: personnel backdrop for humans interacting with machines. Simultaneously,
132
00:18:09.870 –> 00:18:15.489
[email protected]: this new technology appears seemingly out of nowhere last year
133
00:18:16.820 –> 00:18:29.850
[email protected]: to the untrained eye, it can come across as this all-seeing oracle that can answer any question. So if I’m sitting from the perspective of
134
00:18:30.540 –> 00:18:34.430
alex.ol[email protected]: someone in the workforce that spells a threat.
135
00:18:35.030 –> 00:18:44.960
[email protected]: I love the direction that you’re taking your point of view, because of the way that you phrased it, and I will have to look up this quote: we
136
00:18:45.490 –> 00:18:51.040
[email protected]: overestimate the near-term impact and underestimate the long-term impact.
137
00:18:51.670 –> 00:18:53.399
[email protected]: And what I’m finding
138
00:18:54.200 –> 00:18:58.689
alex.o[email protected]: in my personal experience with our clients is
139
00:19:00.140 –> 00:19:02.220
[email protected]: these LLMs,
140
00:19:03.150 –> 00:19:08.840
[email protected]: content assistance or generation tools, they help alleviate
141
00:19:08.990 –> 00:19:13.850
[email protected]: writer’s block. They help get the first notes on the page.
142
00:19:14.250 –> 00:19:20.570
[email protected]: But that human expertise of motivating someone to act,
143
00:19:20.790 –> 00:19:24.670
[email protected]: of understanding the nuance of
144
00:19:24.850 –> 00:19:26.930
[email protected]: what’s being discussed
145
00:19:27.380 –> 00:19:32.249
[email protected]: is, from what I’m gathering from what you’re saying, not quite.
146
00:19:33.290 –> 00:19:41.750
[email protected]: And maybe this is two points: not quite the capability and not quite the near-term intent of this technology. And I think
147
00:19:41.860 –> 00:19:43.210
[email protected]: that that is.
148
00:19:44.430 –> 00:19:52.050
[email protected]: I think that that is a very good dynamic to exist, specifically in the environment that we find ourselves in now.
149
00:19:54.100 –> 00:19:58.589
sarah: Alex, can I? Let me ask a clarification. So,
150
00:19:58.740 –> 00:19:59.970
sarah: are you.
151
00:20:01.450 –> 00:20:08.319
sarah: I agree completely with what you said. But is your question,
152
00:20:09.860 –> 00:20:17.150
sarah: I think, let me just, I kind of started thinking about one word you said: intent. Human intent.
153
00:20:17.480 –> 00:20:25.640
sarah: And I think one of the challenges with the output of large language model systems
154
00:20:26.220 –> 00:20:28.580
is that we’re calling them.
155
00:20:29.170 –> 00:20:31.290
sarah: We’re calling that output information.
156
00:20:32.440 –> 00:20:37.259
sarah: And I’m not sure if that’s a a great word.
157
00:20:38.270 –> 00:20:42.870
There’s a researcher I spoke with last week
158
00:20:43.480 –> 00:20:50.090
sarah: who would like us to remember how large language models work on the inside
159
00:20:50.200 –> 00:20:59.430
sarah: and view the output as synthetic media, because information is constructed by a human.
160
00:20:59.570 –> 00:21:00.540
sarah: And if
161
00:21:01.090 –> 00:21:05.790
sarah: if words are not constructed by a human,
162
00:21:06.490 –> 00:21:08.999
is it information?
163
00:21:09.110 –> 00:21:26.929
sarah: And this is a very interesting discussion, and I think this is nothing that we’re going to solve today. But it touches on another topic in this space, which is anthropomorphizing these systems. As you noted, you know, they’re
164
00:21:27.480 –> 00:21:33.260
they’re impressive. And if you don’t know the inner workings. Or if you’re new to this space.
165
00:21:33.310 –> 00:21:38.250
sarah: or actually, that’s not fair. If you’re a human. you you
166
00:21:38.490 –> 00:21:40.560
automatically
167
00:21:40.710 –> 00:21:44.489
sarah: give intention to text.
168
00:21:44.650 –> 00:21:56.639
sarah: You make meaning out of the kind of bag of words that you get back. And this bag of words is really good. And so it’s our natural tendency
169
00:21:56.830 –> 00:22:08.139
sarah: to find meaning. I mean, think of conspiracy theories: we try to find meaning in disparate facts. And this goes back to
170
00:22:08.390 –> 00:22:09.510
sarah: our
171
00:22:10.830 –> 00:22:15.670
sarah: core human, animal
172
00:22:15.690 –> 00:22:26.339
sarah: backgrounds. You know, this is how we probably survived on the African veldt, right? We needed to understand signal. So
173
00:22:26.670 –> 00:22:31.510
sarah: when I hear the word hallucination,
174
00:22:31.640 –> 00:22:37.190
sarah: I also get a little bit concerned that we’re anthropomorphizing these systems.
175
00:22:39.220 –> 00:22:50.759
sarah: and I think maybe a better word would be error, and to not anthropomorphize these systems. And in using the word hallucination we’re evoking
176
00:22:50.940 –> 00:22:55.560
sarah: spirituality, creativity,
177
00:22:56.750 –> 00:23:05.190
sarah: ideas and words that are very close to human souls. And again, I want to take a step back and say: this is a machine, people.
178
00:23:05.260 –> 00:23:07.920
it’s it’s an error.
179
00:23:07.990 –> 00:23:13.610
sarah: And so I just kind of went on that tirade because of you mentioning intention. But
180
00:23:13.790 –> 00:23:17.690
sarah: I think harnessing
181
00:23:18.300 –> 00:23:26.820
sarah: our human intention to complete tasks, to work together with each other, you know, to build companies and
182
00:23:26.930 –> 00:23:29.760
and support customers.
183
00:23:30.000 –> 00:23:42.770
sarah: We can use these tools to do that. But I want to push back on “hallucinations,” and perhaps on “information” as output.
184
00:23:42.990 –> 00:23:45.060
sarah: And then
185
00:23:46.140 –> 00:24:06.549
sarah: Yeah, sorry, Alex, please.
[email protected]: No, no, no need to apologize. Quite the contrary, Sarah. I am always looking for a good reason to weave the word anthropomorphize into a conversation, and I’m glad that you took it.
186
00:24:06.730 –> 00:24:12.050
[email protected]: Well, it’s interesting, right? Like it’s really interesting how we do this. And
187
00:24:12.880 –> 00:24:19.570
sarah: it’s not like it’s not a bad thing. It’s a natural thing. And this is why we
188
00:24:19.790 –> 00:24:26.220
sarah: you know, this is how we relate to our pets. I mean, you know, this is like
189
00:24:26.690 –> 00:24:36.510
sarah: you, you know. I’ll fight you if you don’t think my cat totally understands what I you know this is a totally human behavior, but
190
00:24:37.190 –> 00:24:39.880
sarah: I it is, and it’s something that that
191
00:24:39.980 –> 00:24:43.049
sarah: that has a
192
00:24:43.360 –> 00:24:52.900
sarah: It has it. It’s a core part of who we are, and there’s a reason that it’s in our the fabric of our humanity. but
193
00:24:53.190 –> 00:24:54.660
sarah: it also
194
00:24:54.820 –> 00:24:58.880
sarah: in this case it confers trust,
195
00:24:59.260 –> 00:25:03.180
and trust is something that should be earned.
196
00:25:03.580 –> 00:25:06.500
sarah: and I like the fact
197
00:25:06.790 –> 00:25:08.930
sarah: that, you know,
198
00:25:08.940 –> 00:25:23.180
as you said, a lot of people are saying, “Oh, ChatGPT is bad for education.” Hey, ChatGPT has gotten a lot of folks who had not previously played with AI playing with AI. It’s lowered the barrier to entry
199
00:25:23.200 –> 00:25:24.740
sarah: in this space.
200
00:25:24.900 –> 00:25:30.390
There’s a ton of things that I had to learn in school that are now, you know, solved problems.
201
00:25:30.830 –> 00:25:31.970
sarah: You
202
00:25:31.980 –> 00:25:50.090
sarah: know, the ensemble learning of the past is completely out the window. It’s great that I understand what’s going on. But if the tools are out there, there’s great documentation, there’s a great open source community, then we have a lot of people entering this workspace
203
00:25:50.110 –> 00:26:04.230
sarah: who are going to be, because some of these current iterations are so fun, they’re sticky. They’re good products, as you and I in the Bay Area would say.
204
00:26:04.420 –> 00:26:09.280
sarah: But as you also mentioned,
205
00:26:09.420 –> 00:26:13.330
sarah: that’s that’s in conflict with the macroeconomic trend
206
00:26:13.360 –> 00:26:20.859
sarah: of letting people go in a lot of mid-size companies, even the larger companies. And I think that there’s
207
00:26:21.120 –> 00:26:24.400
sarah: a sense that
208
00:26:24.570 –> 00:26:39.430
sarah: a lot of mid-size companies that have AI teams are taking a step back and saying: we were going to build this internally, and now we can buy this, and so we don’t need some of these internal people.
209
00:26:39.720 –> 00:26:40.560
and
210
00:26:41.180 –> 00:26:42.550
sarah: I think
211
00:26:42.850 –> 00:26:44.910
that if you look at
212
00:26:45.520 –> 00:26:50.770
sarah: the current strategic ecosystem.
213
00:26:51.100 –> 00:26:59.410
sarah: strategically, this is exactly what OpenAI and some of these companies want to do.
214
00:26:59.760 –> 00:27:03.920
sarah: they’re looking for customers. and
215
00:27:04.140 –> 00:27:26.679
sarah: to some degree the price point has been so low that it’s really lowered the barrier in that dimension as well. It’s an accessible price point to learn these technologies, and then, when you’re playing with them at home, you bring them into the workspace and say, hey, I’ve got some great ways that we can automate
216
00:27:26.900 –> 00:27:36.320
sarah: some workflows. But more broadly, if you look at, you know, these macroeconomic trends:
217
00:27:37.710 –> 00:27:44.260
sarah: OpenAI lost 540 million dollars in ’22, and it has a hundred employees.
218
00:27:44.840 –> 00:27:45.640
sarah: The
219
00:27:45.940 –> 00:27:53.719
sarah: the technologies that we’ve been talking about are extremely extremely compute, intensive, which means
220
00:27:53.760 –> 00:28:05.160
sarah: they’re probably 2 companies who are making money out of Lms right now. One is Nvidia. and the other is whatever power company they were.
221
00:28:05.270 –> 00:28:10.600
sarah: accessing to run these scrapes of the web.
222
00:28:13.060 –> 00:28:27.159
sarah: I doubt it’s Pg. And the as a Pg. And a customer So, Sarah, I want to touch on you. You can cut that. I don’t want them coming after me. That sounds good.
223
00:28:27.260 –> 00:28:35.190
[email protected]: Sarah, I want to touch on a theme we have alluded to throughout the episode,
224
00:28:35.800 –> 00:28:38.150
[email protected]: and that is the theme of trust.
225
00:28:39.090 –> 00:28:42.250
sarah: Ah, I think so, yeah.
226
00:28:42.290 –> 00:28:45.080
[email protected]: as these companies
227
00:28:45.650 –> 00:28:56.389
[email protected]: be they large enterprises, startups, even providers in this space; as these companies embark on their journey
228
00:28:56.400 –> 00:29:01.410
alex.ol[email protected]: to either build Lms. Incorporate generative AI
229
00:29:01.540 –> 00:29:12.080
[email protected]: into their go-to-market, how can companies earn the trust of their consumer base
230
00:29:12.140 –> 00:29:27.820
[email protected]: and use this technology in a way that will not feel invasive, or not feel like it is infringing on the privacy or the rights of the consumer?
231
00:29:30.360 –> 00:29:34.259
Great question. I think that this is going to be,
232
00:29:35.930 –> 00:29:42.250
sarah: you know, I do want to give respect to OpenAI for creating a great product, because
233
00:29:43.190 –> 00:29:46.960
sarah: LLMs have been around, vectorized
234
00:29:46.990 –> 00:29:49.609
sarah: data
235
00:29:49.640 –> 00:29:54.439
word embeddings have been around. But they created a front end that was,
236
00:29:54.580 –> 00:29:57.870
sarah: It’s a great experience. and
237
00:29:58.210 –> 00:30:09.370
sarah: I think that that engendered trust, in a way that they can parlay into building a
238
00:30:10.670 –> 00:30:25.570
sarah: partner and customer business that will have some really big, impressive names. What does that mean, you know, when the rubber hits the road, for everyone else? Now,
239
00:30:25.740 –> 00:30:27.780
sarah: when you come into
240
00:30:28.060 –> 00:30:33.770
sarah: a situation where you’re trying to win a sale from
241
00:30:33.890 –> 00:30:38.129
sarah: a customer, I think it’s really important to say:
242
00:30:38.930 –> 00:30:57.289
sarah: how can I define my company, and what do I have that makes my company unique? And in most cases the answer to this question, before November 30, 2022, would be: I have a relationship
243
00:30:57.400 –> 00:31:07.420
sarah: with this customer. I have relationships with customers like this customer, and I have a long-standing
244
00:31:07.740 –> 00:31:09.470
data set
245
00:31:09.580 –> 00:31:11.430
sarah: that includes
246
00:31:11.490 –> 00:31:29.060
sarah: well-documented metadata of these relationships. Okay, so maybe that was a computer scientist describing an enterprise customer engagement. But I think that fundamentally has not changed. If you came into a
247
00:31:29.100 –> 00:31:34.260
sarah: space where you are providing a service, and you have
248
00:31:34.370 –> 00:31:45.980
sarah: a great customer base, and because of that ongoing engagement over years, you have a ton of data that reflects your customers.
249
00:31:46.090 –> 00:31:49.849
sarah: Then you are going to be in a great situation
250
00:31:50.160 –> 00:31:52.319
sarah: in the large language model realm.
251
00:31:52.450 –> 00:32:05.930
sarah: Why? Because you’ve already gained the trust, and you’ve already created great results for your customers. And now you’re just laying a a technical
252
00:32:06.110 –> 00:32:13.129
sarah: technology on top of that, a technical layer that says: we can
253
00:32:13.280 –> 00:32:29.280
sarah: optimize this past data even more, we can surface new insights. We can respond to your questions in a more naturalistic way. And we can have deeper
254
00:32:29.450 –> 00:32:53.260
sarah: information discovery from our pre-existing information, using some of these vector-based approaches. Right? So when you think about trust, I don’t think about someone who’s coming to the market tomorrow with “I can solve X, Y, and Z with large language models or ChatGPT” for,
255
00:32:53.680 –> 00:33:07.300
sarah: you know, road cleaning, whatever it may be. But I do think that trust is about relationships, and from a technical perspective, it’s about data. And if you can leverage that data
256
00:33:07.550 –> 00:33:20.730
sarah: to create naturalistic experiences for your existing customers, you’re going to make them happy, and you’re going to gain more data. Data is key, and data is what
257
00:33:20.990 –> 00:33:31.749
sarah: has really amplified the large language model scene. So I think a lot of companies need to understand that their data is gold.
258
00:33:31.790 –> 00:33:50.039
sarah: At Orange, we take our customer engagement extremely seriously. We are customer-centric. Our concern is a great customer experience, and we know that we’re sitting on a gold mine of data. We know past preferences.
259
00:33:50.280 –> 00:34:04.840
sarah: And now we have a tool that allows us to personalize and create even better experiences, you know. And that’s how we’re amplifying that trust.
260
00:34:06.480 –> 00:34:20.589
[email protected]: Fantastic. Well, Sarah, I know we’ve covered a lot of ground today, and thank you for all of the insight. I have one final question, now that we’re on the topic of Orange and the projects that you’re working on.
261
00:34:21.150 –> 00:34:28.960
[email protected]: What’s on your plate right now, or in your pipeline, that excites you the most?
262
00:34:30.170 –> 00:34:33.969
sarah: thank you. Because
263
00:34:34.580 –> 00:34:39.539
sarah: I’ve been talking a lot about generative, and
264
00:34:39.770 –> 00:34:50.789
sarah: I hope this is as articulate as needed. But the thing that excites me the most is working on
265
00:34:50.969 –> 00:34:56.769
low-resource language projects. We have a lot of customers
266
00:34:56.810 –> 00:35:00.150
sarah: in our West African, well,
267
00:35:00.290 –> 00:35:11.150
sarah: okay, let me start at the beginning. Hi, my name is Sarah, and I’m really interested in low-resource language projects. This is why I joined Orange 5 years ago.
268
00:35:11.780 –> 00:35:18.680
Orange is in a particularly well-suited position to support
269
00:35:19.480 –> 00:35:34.140
sarah: our customers in numerous areas. We have a bank, we have a video game company, we have entertainment streaming services, we lay Internet cable, we have satellites. You know, what don’t we do? Well,
270
00:35:34.210 –> 00:35:35.330
sarah: we
271
00:35:35.420 –> 00:35:41.349
sarah: are focused on providing great consumer experiences to everyone.
272
00:35:41.450 –> 00:35:49.150
sarah: and many of our customers are in places that have underserved languages.
273
00:35:49.450 –> 00:36:01.050
sarah: So Orange is in Francophone Africa. But a lot of people in these regions no longer speak French; it’s not considered
274
00:36:01.460 –> 00:36:06.989
a necessary language. As new trends in
275
00:36:07.000 –> 00:36:11.779
sarah: political and community
276
00:36:12.170 –> 00:36:19.770
sarah: economics spring up, you know, there’s the ability to buy and sell products locally,
277
00:36:19.890 –> 00:36:31.249
sarah: and with that local focus a lot of people have been returning to local languages that are spoken
278
00:36:31.640 –> 00:36:48.650
sarah: but comparatively underserved. And so what does that mean? It means that, with the shifting dynamics of these areas as well as other, you know, macroeconomic forces, we have a lot of customers who do not speak French,
279
00:36:48.690 –> 00:36:52.640
sarah: or would prefer not to speak French.
280
00:36:53.030 –> 00:36:53.910
sarah: Now.
281
00:36:54.180 –> 00:37:07.009
sarah: I describe low-resource languages to the uninitiated as languages where there’s no Reddit, although in the past week that allusion is less powerful. But
282
00:37:07.040 –> 00:37:26.570
sarah: it means that there’s not a lot of data online. So I am very passionate about selling people products in whatever language they would like, and providing goods and services and whatever language they would like. So recently, I ran a machine translation competition between French and Bombbara. That was text based. We had
283
00:37:26.930 –> 00:37:34.860
sarah: 20 participants and cash prizes. The winners are from all over the globe, and, well,
284
00:37:34.960 –> 00:37:48.069
sarah: only a couple of the participants actually spoke Bambara. Almost all of the participants in the competition spoke, or had family members who spoke, a different low-resource language. And
285
00:37:48.140 –> 00:37:58.590
sarah: this was a really impressive experience, because it reminded us that no matter how much the large language model hype cycle
286
00:37:58.830 –> 00:38:02.260
sarah: you know, brings us cool, sticky
287
00:38:02.280 –> 00:38:04.259
sarah: products and
288
00:38:04.280 –> 00:38:16.720
sarah: crazy new applications, and really, you know, the Pope in a puffer coat, whatever it may be. I think it’s really important to remember that
289
00:38:16.840 –> 00:38:31.330
sarah: natural language processing is not a solved domain, and there are a lot of challenges out there for the global population that are not English-centric or French-centric.
290
00:38:31.360 –> 00:38:38.210
sarah: And I’m hoping that you know, when we were working on this last iteration of the translation project.
291
00:38:38.240 –> 00:38:51.019
sarah: everyone was saying, okay, large language models have dropped; how relevant are they for this domain? And so our next step is to experiment with that and figure out: how can you
292
00:38:51.030 –> 00:39:02.640
sarah: augment large language models for languages that basically have small language models and, you know, have their own orthographic challenges?
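A side note on how results in a competition like the French-Bambara challenge are typically scored: low-resource machine translation is commonly evaluated with character-level n-gram F-scores such as chrF, which tolerate rich morphology and spelling variation better than word-level metrics like BLEU. The sketch below is a rough chrF-style scorer; the episode doesn’t say which metric the challenge actually used, and the example strings are toy placeholders.

```python
from collections import Counter

def char_ngrams(text: str, n: int) -> Counter:
    """Character n-grams of a string, ignoring spaces."""
    s = text.replace(" ", "")
    return Counter(s[i : i + n] for i in range(len(s) - n + 1))

def chrf(hypothesis: str, reference: str, max_n: int = 6, beta: float = 2.0) -> float:
    """chrF-style score: average character n-gram precision/recall, combined as F-beta."""
    precisions, recalls = [], []
    for n in range(1, max_n + 1):
        hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
        if not hyp or not ref:
            continue
        overlap = sum((hyp & ref).values())  # clipped n-gram matches
        precisions.append(overlap / sum(hyp.values()))
        recalls.append(overlap / sum(ref.values()))
    if not precisions:
        return 0.0
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    if p + r == 0.0:
        return 0.0
    # beta = 2 weights recall more heavily, as chrF conventionally does.
    return (1 + beta**2) * p * r / (beta**2 * p + r)

print(chrf("i ni ce", "i ni ce"))   # identical strings score 1.0
print(chrf("i ni ce", "aw ni ce"))  # partial character overlap scores lower
```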
293
00:39:02.710 –> 00:39:12.449
sarah: And at the end of the day it’s really about customer service and people. So I think it’s been a really great experience, and I’ve met a lot of really interesting people,
294
00:39:12.570 –> 00:39:20.619
sarah: and I trust that this will continue to be a great way to leverage new technologies
295
00:39:20.640 –> 00:39:23.330
sarah: with a global
296
00:39:23.470 –> 00:39:29.129
sarah: economy and global engagement. But I do want to reiterate that
297
00:39:29.580 –> 00:39:39.040
sarah: LLMs have not solved AI. There’s probably going to be a plateau right now, where
298
00:39:39.240 –> 00:39:46.870
sarah: OpenAI has mentioned they don’t have plans to release a GPT-5. Bard,
299
00:39:47.010 –> 00:39:51.170
sarah: you know, the LLaMAs,
300
00:39:51.190 –> 00:39:52.590
different
301
00:39:52.660 –> 00:39:58.190
sarah: models are out there, and there’s been a lot of optimization around
302
00:39:58.210 –> 00:40:15.360
sarah: compute, and running these models on smaller and smaller memory devices. And so I would watch to see what Apple is up to, because I think they’re going to jump on the scene with a pretty interesting privacy-
303
00:40:15.440 –> 00:40:18.919
sarah: oriented solution, as is their brand.
304
00:40:19.160 –> 00:40:30.080
sarah: But I’m I’m really excited about the power of communication and the power of speaking to people in their own language. So watch the space for more machine translation
305
00:40:30.240 –> 00:40:34.400
sarah: and low resource language innovations.
306
00:40:35.490 –> 00:40:44.690
Well, thank you, Sarah. You know, this has been a fascinating discussion, one which I think the audience is really going to appreciate, not only for the
307
00:40:44.800 –> 00:40:49.650
[email protected]: information and insight that you’ve provided, but I think also for a very
308
00:40:49.830 –> 00:40:59.740
[email protected]: actionable, optimistic, and pragmatic point of view. Again, for the listeners, this was Sarah Luger.
309
00:40:59.880 –> 00:41:12.280
[email protected]: Sarah, thank you very much for joining. I really enjoyed our conversation, and I’m looking forward to the next one.
310
00:41:14.060 –> 00:41:16.500
sarah: Thank you, Alex. It was a pleasure.