Transcript
1
00:00:00,017 --> 00:00:04,417
Hi, Sandy Estrada here. Welcome to this week's episode of How I Met Your Data.
2
00:00:04,577 --> 00:00:07,237
This week, we have a long one. It's a bit of a double feature.
3
00:00:07,537 --> 00:00:10,537
Anjali and I are going to talk about a couple of conferences that we've attended
4
00:00:10,537 --> 00:00:14,897
over the last few weeks. And we also have a special guest, Rachel Workman.
5
00:00:15,077 --> 00:00:17,477
She is the head of data at SoundCommerce.
6
00:00:17,717 --> 00:00:22,617
And I am so excited to share Rachel with you. I've known Rachel for seven years.
7
00:00:22,737 --> 00:00:24,737
She is a bolt of lightning.
8
00:00:24,917 --> 00:00:29,337
She just has a very unique perspective. She has worked both as an engineer.
9
00:00:29,517 --> 00:00:32,577
She's worked on the business side. She's worked as a management consultant,
10
00:00:32,697 --> 00:00:34,557
and now she's back on the engineering
11
00:00:34,557 --> 00:00:39,177
side as head of data for SoundCommerce. So let's get into it.
12
00:00:39,440 --> 00:00:57,200
Music.
13
00:00:56,517 --> 00:01:00,037
That was your week. I was on vacation last week. Prior to that,
14
00:01:00,117 --> 00:01:04,637
I went to a couple of conferences, one of which you attended as well.
15
00:01:04,817 --> 00:01:06,657
A little bit of travel and meetings, etc.
16
00:01:06,997 --> 00:01:12,337
It's been a whirlwind couple of years. Yeah, I mean, it feels like a bit of a whirlwind as well.
17
00:01:12,617 --> 00:01:15,657
You were out last week, as you mentioned. I'm out next week.
18
00:01:15,937 --> 00:01:22,657
So just kind of that run up to being out for the week has been kind of mind-boggling
19
00:01:22,657 --> 00:01:24,457
a little bit, a little bit of mental gymnastics.
20
00:01:25,817 --> 00:01:32,737
But I'd rather say that as opposed to telling you I had nothing to try to close out before I left.
21
00:01:33,177 --> 00:01:36,277
Well, you can time go by before I go on vacation. Yeah.
22
00:01:37,137 --> 00:01:38,957
But my seat's super warm.
23
00:01:41,037 --> 00:01:45,037
Spoiler alert, when you get back, it's 10 times worse than the prep to leave.
24
00:01:45,197 --> 00:01:49,657
I'll tell you that much. It's been hoey for me. Yeah, it's the luxury tax for going away.
25
00:01:49,997 --> 00:01:53,697
You prep so much before you leave the office.
26
00:01:53,977 --> 00:01:57,737
And then when you get back, it's like you pay the penalty for all the stuff
27
00:01:57,737 --> 00:02:02,077
that didn't get done, plus all the stuff that went awry, plus all the stuff
28
00:02:02,077 --> 00:02:04,497
that needs to get done in the future. But it's okay.
29
00:02:04,677 --> 00:02:09,017
It's worth connection and time with my family, going somewhere new,
30
00:02:09,217 --> 00:02:12,397
all of that. I'll take it. Yeah, same here. Same here. Absolutely.
31
00:02:12,997 --> 00:02:17,957
Let's talk about the conferences we both went to. So I went to FIMA Boston.
32
00:02:18,557 --> 00:02:23,757
It's the Financial Services Data Management Conference. It's been running for a while now.
33
00:02:24,097 --> 00:02:28,437
I really enjoyed it. I actually met the conference producers leading up to the
34
00:02:28,437 --> 00:02:35,837
conference because I was going to emcee one of the tracks and run a couple of panels.
35
00:02:36,057 --> 00:02:40,697
So I moderated a couple of panels during the event. And what really got my attention
36
00:02:40,697 --> 00:02:43,937
was the agenda and the way they set it up.
37
00:02:43,957 --> 00:02:48,997
There's a lot of roundtable discussions, a lot of opportunity for folks to engage
38
00:02:48,997 --> 00:02:51,537
and interact and learn from one another and network.
39
00:02:52,198 --> 00:02:55,298
It was probably one of the first conferences I've been to in a very long time
40
00:02:55,298 --> 00:03:00,358
where I really was able to meet folks from financial services that were really
41
00:03:00,358 --> 00:03:03,238
in the trenches trying to solve problems with data.
42
00:03:03,398 --> 00:03:07,558
One interesting takeaway from that event was one of the sessions I went to was
43
00:03:07,558 --> 00:03:12,358
this boardroom session, which is more of a workshop type session on Gen AI.
44
00:03:12,698 --> 00:03:16,038
I was astonished. The speaker, the person moderating the session,
45
00:03:16,218 --> 00:03:20,538
asked the room, who's working actively on Gen AI? No one raised their hands. Interesting.
46
00:03:21,258 --> 00:03:25,878
Yeah. Yeah, exactly. I was in shock. I was like, wait a minute,
47
00:03:25,918 --> 00:03:26,618
wait a minute, wait a minute.
48
00:03:26,758 --> 00:03:31,258
The moderator did a great job of outlining all the different areas we've seen
49
00:03:31,258 --> 00:03:32,938
in terms of use cases for Gen AI.
50
00:03:33,178 --> 00:03:38,398
Content summarization, being able to search content easily, being able to aggregate
51
00:03:38,398 --> 00:03:42,298
trends and data, being able to do code migrations, those kinds of things,
52
00:03:42,338 --> 00:03:44,158
code facilitation use cases.
53
00:03:44,278 --> 00:03:46,358
There were some others, but no one in the room.
54
00:03:46,518 --> 00:03:49,858
Across all those use cases, there wasn't a single hand in that room.
55
00:03:49,938 --> 00:03:53,578
And there must have been 50 people in there. That's really shocking.
56
00:03:53,938 --> 00:03:59,018
You know, we talked a little bit a couple of weeks back about these shadow Gen AI
57
00:03:59,814 --> 00:04:04,254
organizations that are spinning up kind of under the covers at clients that we work with.
58
00:04:04,394 --> 00:04:10,954
So it's surprising to hear that. Were there any key themes or drivers around why?
59
00:04:11,334 --> 00:04:17,434
What I walked away with was that data teams are not involved because the folks
60
00:04:17,434 --> 00:04:20,354
in the room were part of governance, data management, right?
61
00:04:20,654 --> 00:04:25,574
They're talking about it. They're trying to prepare for it, but it's potentially
62
00:04:25,574 --> 00:04:27,754
happening in other pockets within the organization.
63
00:04:27,754 --> 00:04:31,594
And it's funny because I saw a LinkedIn post this morning about who should own
64
00:04:31,594 --> 00:04:37,854
AI, and your buddy Malcolm Hawker responded, saying that no one should own AI.
65
00:04:38,054 --> 00:04:40,874
AI is going to be in the work stream.
66
00:04:41,034 --> 00:04:43,954
It's going to be in, you know, in the applications you purchase.
67
00:04:44,054 --> 00:04:47,714
It's going to be built into your work stream. One person shouldn't own it,
68
00:04:47,754 --> 00:04:48,994
but people should own data, right?
69
00:04:49,154 --> 00:04:51,994
Like I thought that was interesting and I agree with that perspective.
70
00:04:52,414 --> 00:04:56,834
So I walked away with all of that, and it kind of informed my conclusion today, to say,
71
00:04:56,934 --> 00:05:01,134
I have a feeling the folks in the room are just not involved in the solutioning for Gen AI.
72
00:05:01,314 --> 00:05:05,554
It's a different group. Totally makes sense to me as I think about it.
73
00:05:05,634 --> 00:05:08,474
But yeah, it was a great event. Lots of talk around governance.
74
00:05:08,774 --> 00:05:13,774
Yeah, I was really busy between the panels. Data democratization, that was the topic.
75
00:05:14,014 --> 00:05:18,374
So the panelists on that were incredible. And we ended up recording with FIMA,
76
00:05:18,574 --> 00:05:22,194
two separate podcasts on two separate topics with that panel.
77
00:05:22,194 --> 00:05:25,474
So that was a lot of fun as well, but it was a busy two days.
78
00:05:25,534 --> 00:05:28,814
Sounds like it. Sounds like it. And then you immediately hopped a plane,
79
00:05:28,974 --> 00:05:31,334
met me in New York for the next conference.
80
00:05:31,874 --> 00:05:34,914
Right. So we met at Data Universe, two-day conference there.
81
00:05:35,074 --> 00:05:37,834
It was their first time running the event in the United States.
82
00:05:37,874 --> 00:05:42,014
The same production company has a sister event in London.
83
00:05:42,374 --> 00:05:46,774
Big Data London, I believe. Correct. So they run Big Data London and Data Universe
84
00:05:46,774 --> 00:05:48,794
is their sister event here in the United States.
85
00:05:48,894 --> 00:05:52,694
This was the first time. It was an interesting space, wasn't it? Yeah, yeah.
86
00:05:52,794 --> 00:05:56,594
I mean, I've been to the Jacob Javits Convention Center before for the big auto show.
87
00:05:56,794 --> 00:06:01,534
So to go there for something that's a little bit more professionally driven was interesting.
88
00:06:02,515 --> 00:06:07,695
I think what was unique compared to some of the other conferences that I've
89
00:06:07,695 --> 00:06:13,095
been to over the last year, two years, was the fact that there were so many,
90
00:06:13,175 --> 00:06:15,335
like all the stages were in one place.
91
00:06:15,475 --> 00:06:22,235
So it wasn't separated by different rooms or different floors for presentations,
92
00:06:22,455 --> 00:06:25,355
but you were given headphones at each of the stages.
93
00:06:25,615 --> 00:06:30,915
I really enjoyed that. I would love to hear your thoughts on that multi-stage approach.
94
00:06:30,915 --> 00:06:33,795
But I thought it allowed for a
95
00:06:33,795 --> 00:06:37,775
little bit more fluidity between different
96
00:06:37,775 --> 00:06:41,095
talks, as well as more networking.
97
00:06:41,095 --> 00:06:44,195
I ran into a number of people that I've seen
98
00:06:44,195 --> 00:06:47,335
before, spoken to before, and just felt that
99
00:06:47,335 --> 00:06:50,035
that open concept allowed for a
100
00:06:50,035 --> 00:06:52,795
lot more connection with people that
101
00:06:52,795 --> 00:06:55,715
we knew, as well as new friends. Yeah, I agree
102
00:06:55,715 --> 00:06:58,575
with the pros there. I absolutely agree. There was
103
00:06:58,575 --> 00:07:02,395
way more networking than I think other layouts allow
104
00:07:02,395 --> 00:07:05,235
for, because you spend so much time just walking from,
105
00:07:05,235 --> 00:07:08,275
like, floor to floor, area to area. Having it
106
00:07:08,275 --> 00:07:11,235
laid out the way they did in that open forum was really nice because
107
00:07:11,235 --> 00:07:14,095
everyone was always going back to the center where all the couches
108
00:07:14,095 --> 00:07:17,615
and tables were, to have a conversation, if not talking to the
109
00:07:17,615 --> 00:07:21,335
booths. I've seen more people at booths there than I did at other places because
110
00:07:21,335 --> 00:07:24,895
it was right there in your face; you couldn't get away from it. I really enjoyed
111
00:07:24,895 --> 00:07:28,815
that. I think the downside, having been on one of the panels,
112
00:07:28,935 --> 00:07:32,975
as a panelist, the downside that I felt was the headphones.
113
00:07:33,315 --> 00:07:37,975
It did not really allow for interactivity in terms of the conversation you're trying to have.
114
00:07:38,215 --> 00:07:41,495
I think the only feedback I would give them is provide audience microphones
115
00:07:41,495 --> 00:07:46,055
for the Q&A at the end, versus having the iPad questions, which seem so anonymous.
116
00:07:46,455 --> 00:07:52,815
And I think the format, in terms of the short time frame for the presentations,
117
00:07:52,815 --> 00:07:56,295
could be a little challenging as well, because I think that the presentations
118
00:07:56,295 --> 00:07:59,575
ran about 20 minutes, which was really quick.
119
00:07:59,875 --> 00:08:03,235
Felt like we were barely scratching the surface,
120
00:08:03,435 --> 00:08:07,315
whereas a little bit more time probably would have allowed for more depth and
121
00:08:07,315 --> 00:08:08,575
some of those conversations.
122
00:08:09,075 --> 00:08:12,215
Yeah, no, I totally agree. But there were a lot of great speakers there.
123
00:08:12,475 --> 00:08:19,475
I met the head of AI at Airbnb and heard her story. And that was like super impressive.
124
00:08:19,655 --> 00:08:22,455
It was funny because I met her on the couch before the event.
125
00:08:22,575 --> 00:08:25,175
I got there super early that day. It was the second day of the event.
126
00:08:25,295 --> 00:08:27,435
I got there very early, grabbed some coffee.
127
00:08:27,695 --> 00:08:30,915
There was maybe me and one other person there. And then all of a sudden,
128
00:08:30,935 --> 00:08:34,495
I'm looking around, it was all women. All the women showed up early. It was fascinating.
129
00:08:34,755 --> 00:08:39,935
So we sat on the couch and chit-chatted. I met a number of very interesting people there.
130
00:08:40,095 --> 00:08:43,835
So really, really cool layout for networking for sure. For sure.
131
00:08:43,995 --> 00:08:47,455
Looking forward to see how that evolves next year. Yeah, for sure.
132
00:08:47,535 --> 00:08:53,315
I mean, they had a very kind of wide-ranging set of agenda topics as well.
133
00:08:53,540 --> 00:08:59,820
But kind of as a counterpoint to your FIMA experience, there were a lot of sessions
134
00:08:59,820 --> 00:09:04,560
on AI and Gen AI at this particular conference. I think I know why.
135
00:09:06,900 --> 00:09:11,240
I hate to say this because I appreciate it. I get a lot out of it.
136
00:09:11,520 --> 00:09:16,220
But I think some of these conference coordinators need to keep an eye out.
137
00:09:16,300 --> 00:09:19,980
There are a lot of data influencers out there these days. I mean,
138
00:09:20,000 --> 00:09:24,800
when I see data influencers creating companies to literally become data influencers,
139
00:09:25,080 --> 00:09:27,440
I get concerned because now you're just talking to talk.
140
00:09:27,720 --> 00:09:31,760
That's concerning to me in terms of a marketplace where technology is moving
141
00:09:31,760 --> 00:09:34,300
really fast to see that happening.
142
00:09:34,400 --> 00:09:37,940
Because the reality is the messages that are being put out there,
143
00:09:38,020 --> 00:09:41,180
like Gen AI is going to solve all world problems, is tough.
144
00:09:41,380 --> 00:09:45,420
It's tough to hear that, because I know on the ground that is not happening
145
00:09:45,420 --> 00:09:47,880
and there's still very large problems to solve.
146
00:09:48,080 --> 00:09:51,080
Gen AI is not going to be the magic bullet for us. So yeah,
147
00:09:51,220 --> 00:09:54,300
I think that was kind of the disappointing aspects there.
148
00:09:54,480 --> 00:09:59,400
But I'm hoping that conference companies, organizations do what FIMA did.
149
00:09:59,560 --> 00:10:05,120
Like literally, if you're not part of an organization, a company and have a
150
00:10:05,120 --> 00:10:07,380
case study, you don't get to be in the room.
151
00:10:07,560 --> 00:10:11,520
I want organizations doing that more and more so that more of these real life
152
00:10:11,520 --> 00:10:13,380
use cases actually get on the table.
153
00:10:13,741 --> 00:10:20,321
Yeah, exactly. Make it real. Make it something that we can then look at and
154
00:10:20,321 --> 00:10:22,681
say, how do we embed this in our organization?
155
00:10:23,001 --> 00:10:27,181
How do we replicate that level of success? Similar fashion. Right.
156
00:10:27,421 --> 00:10:31,941
And then they wonder why clients aren't there. Well, clients aren't learning from other clients.
157
00:10:32,141 --> 00:10:36,161
You want the local, you know, CPG company or the manufacturing company,
158
00:10:36,241 --> 00:10:39,461
they're not going to show up to an event where they don't see themselves on stage.
159
00:10:39,701 --> 00:10:43,181
You want to see other organizations solving the problems that you're trying
160
00:10:43,181 --> 00:10:44,681
to solve and you want to learn from them.
161
00:10:44,781 --> 00:10:49,321
And if you're just hearing from, you know, talking heads or product vendors
162
00:10:49,321 --> 00:10:53,661
or, I'll even say, consulting firms, that's a challenge. Yeah,
163
00:10:53,801 --> 00:10:54,981
I'm gonna eat my words on that later.
164
00:10:55,081 --> 00:10:57,581
But for now, that's where I'm staying.
165
00:10:58,741 --> 00:11:02,161
You're cringing over there. I love it. You're like, Oh, no, no,
166
00:11:02,161 --> 00:11:08,261
because my mind's now racing going, Okay, so who are we bringing with us to
167
00:11:08,261 --> 00:11:11,101
really talk about a meaty success story?
168
00:11:11,341 --> 00:11:15,961
So we aren't that consultancy that's just, yeah, talking about what we could
169
00:11:15,961 --> 00:11:21,141
do, but really anchoring on what we have done, and what benefit it's brought
170
00:11:21,141 --> 00:11:22,921
to an organization. Yeah.
171
00:11:23,041 --> 00:11:26,421
And, you know, I say that kind of tongue in cheek because obviously I was on
172
00:11:26,421 --> 00:11:28,701
a panel. That entire panel was consultants.
173
00:11:28,941 --> 00:11:34,081
We kind of turned the conversation on its side where the moderator acted like
174
00:11:34,081 --> 00:11:37,701
a client and we're trying to convince this client what this new terminology
175
00:11:37,701 --> 00:11:39,081
was of data intelligence.
176
00:11:39,481 --> 00:11:43,061
So it was a great conversation. I definitely got a lot of feedback from it.
177
00:11:43,061 --> 00:11:47,381
It seems like, you know, with newer topics where there aren't too many
178
00:11:47,381 --> 00:11:51,061
case studies, where there aren't too many folks who actually have done it or
179
00:11:51,061 --> 00:11:53,541
can speak to it, you want a different perspective. Yeah, that makes sense.
180
00:11:53,781 --> 00:11:59,001
But to do that the whole time, for the entire conference, that's a challenge. Yeah, I agree.
181
00:11:59,341 --> 00:12:02,561
That's why I called out the Airbnb one because it was fascinating.
182
00:12:02,881 --> 00:12:09,301
It was a real use case. It was about blockchain and Gen AI, a topic I had not even thought of.
183
00:12:09,401 --> 00:12:14,221
So that was super fascinating to me, to hear that case study and to understand the application.
184
00:12:14,641 --> 00:12:18,581
Awesome. So we're going to, we're a couple of minutes away from Rachel Workman.
185
00:12:20,541 --> 00:12:23,261
Hey, Rachel. Hey, that was fun.
186
00:12:23,608 --> 00:12:29,708
That works. I was like, oh my goodness, I'm going to fail on the very first step, the technology.
187
00:12:30,268 --> 00:12:34,988
No, no. That was my sending you the wrong link trick. It works every time.
188
00:12:36,188 --> 00:12:40,828
You're like, we'll start with panic and then it'll be all uphill from there.
189
00:12:40,928 --> 00:12:43,448
Exactly. A little bit of panic helps.
190
00:12:44,068 --> 00:12:48,508
You get your heart pumping. You got to be ready to go and excited to be here.
191
00:12:48,888 --> 00:12:50,588
Hopefully that did it for you.
192
00:12:51,448 --> 00:12:56,588
Sure. How are you today, Anjali? Pretty good. Pretty good. We're leaving for vacation tomorrow.
193
00:12:56,788 --> 00:13:00,508
So I'm starting to get into vacation mindset.
194
00:13:00,868 --> 00:13:04,448
So you get to that edge where you're like, I'm going to be free soon.
195
00:13:05,068 --> 00:13:06,188
You're never really free.
196
00:13:09,468 --> 00:13:12,988
See, I'm not gonna have to worry about my existentialness at all.
197
00:13:13,668 --> 00:13:15,328
You guys share that trait.
198
00:13:16,108 --> 00:13:20,448
Well, Sandy and I were actually talking about the luxury tax that you pay when
199
00:13:20,448 --> 00:13:22,108
you get back from vacation, right?
200
00:13:22,228 --> 00:13:27,328
It's almost like you're already tired from this amazing vacation that you had,
201
00:13:27,428 --> 00:13:33,128
but now all of a sudden there's this pile that you just have to start working through.
202
00:13:33,548 --> 00:13:38,648
And it's just almost like you're paying a penalty for having gone away for a little bit. You are.
203
00:13:38,848 --> 00:13:42,208
Like, I agree with you completely. It's not almost, it's you are because the
204
00:13:42,208 --> 00:13:46,688
work, like, doesn't go away. And people will cover for you on, like, really,
205
00:13:46,748 --> 00:13:49,008
really critical stuff, but they don't have time.
206
00:13:49,008 --> 00:13:52,108
Nobody has time to like pick up your work so
207
00:13:52,108 --> 00:13:55,268
all you've done is condense your work into a smaller time period, right?
208
00:13:55,268 --> 00:14:01,528
I'm still a fan of vacation. Right, but the work didn't go anywhere. That's
209
00:14:01,528 --> 00:14:09,488
hilarious. I'm definitely a fan of vacation; I take way too much. So, Rachel, I actually
210
00:14:09,488 --> 00:14:14,288
did a little digging to figure out when we met. I was wondering that. Yeah,
211
00:14:14,368 --> 00:14:17,068
it was February 2017, in case you were wondering.
212
00:14:17,308 --> 00:14:23,768
So it's been a while. Seven years? Yeah, seven years now. And we engaged for about two months.
213
00:14:24,048 --> 00:14:28,528
I don't know what I would have put it at, but I can't believe we only engaged for like two months.
214
00:14:28,828 --> 00:14:31,608
Two months. It felt so much longer.
215
00:14:34,068 --> 00:14:38,208
I don't know if that's because of me or because of the product or because of
216
00:14:38,208 --> 00:14:41,848
the situation at the organization, which shall we name?
217
00:14:42,188 --> 00:14:46,668
Nameless. Yeah, really, you were the bright spot in that project.
218
00:14:46,908 --> 00:14:50,808
I felt like you were one of the lone voices of sanity.
219
00:14:51,028 --> 00:14:54,268
You know what it is when you're in a project and the words come out of your
220
00:14:54,268 --> 00:14:57,468
mouth and you look around and you're like, no, nobody has an idea what I'm talking about.
221
00:14:57,568 --> 00:15:00,568
Everybody says things and it all sounds like it's like Peanuts characters.
222
00:15:00,728 --> 00:15:03,188
It's like you're like, nobody's making any sense. And then somebody makes sense.
223
00:15:03,228 --> 00:15:06,068
And you're like, this person makes sense. And it leaves
224
00:15:06,450 --> 00:15:11,050
this imprint in your brain. So I felt very much like that experience.
225
00:15:11,230 --> 00:15:13,690
Well, I appreciate that. I appreciate you. And quite frankly,
226
00:15:13,790 --> 00:15:18,130
I was surprised too. I was like, oh, it's two months. You left an impression on me, obviously.
227
00:15:18,310 --> 00:15:21,150
I think you and I have been reaching out to one another over the years.
228
00:15:21,290 --> 00:15:22,890
So it's mutual was my point.
229
00:15:22,970 --> 00:15:27,990
But at the time, you were the head of operations, the head of customer success.
230
00:15:28,270 --> 00:15:31,950
It seems like you had multiple roles there. Yeah, 2017.
231
00:15:32,550 --> 00:15:37,190
I think I was still over North American customer success and services.
232
00:15:37,650 --> 00:15:40,490
So I've got to know you for the past seven years. Maybe we can spend a little
233
00:15:40,490 --> 00:15:44,910
time educating Anjali and our listeners in terms of your past and who you are. Sure.
234
00:15:45,070 --> 00:15:51,810
So I'd love to. Rachel Workman. I'm currently the VP of Data at a Series
235
00:15:51,810 --> 00:15:55,670
A data platform startup company, SoundCommerce.
236
00:15:55,870 --> 00:15:59,370
I'm immersed in this field every single day. I'm doing cool things.
237
00:15:59,530 --> 00:16:03,970
My company built a tool that basically makes building pipelines
238
00:16:04,070 --> 00:16:05,490
more accessible and flexible.
239
00:16:05,670 --> 00:16:08,510
So we're playing in the whole low-code, no-code space:
240
00:16:08,710 --> 00:16:14,270
more flexibility in how you shape data, pulling semantic modeling in-stream,
241
00:16:14,430 --> 00:16:17,290
as opposed to working on data at rest in the data warehouse.
242
00:16:17,510 --> 00:16:21,970
It lands that data in maybe a more usable format, which comes down to time to value and
243
00:16:21,970 --> 00:16:24,170
cost avoidance on some of the data processing.
244
00:16:24,170 --> 00:16:28,430
So super cool things to be working on in the data space.
245
00:16:28,690 --> 00:16:32,870
It is my obsession, my passion, and my life in that space.
246
00:16:33,150 --> 00:16:37,730
But as you know, because you met me during that time, my path here was anything
247
00:16:37,730 --> 00:16:42,350
but conventional. If I look back, I started out, I started out on the right path.
248
00:16:42,590 --> 00:16:46,450
Well, how biased am I? One of my first jobs in grad school was database programmer
249
00:16:46,450 --> 00:16:51,390
for a high-volume shipping system, you know, building databases through SQL Server.
250
00:16:51,710 --> 00:16:54,810
I think that might be where my love of this might have been born.
251
00:16:54,950 --> 00:16:58,690
If you want the whole dinosaur story, we also built in VB.NET. Yeah, that.
252
00:16:59,068 --> 00:17:03,208
That places me in time. But it wasn't too long after that that I moved on:
253
00:17:03,268 --> 00:17:06,708
I wanted to see the business side, understand really more about why we were
254
00:17:06,708 --> 00:17:07,788
building the things we were building.
255
00:17:07,888 --> 00:17:11,368
And I moved into management consulting and then into services leadership.
256
00:17:11,668 --> 00:17:14,868
And I spent the preponderance of my career on the business side,
257
00:17:14,928 --> 00:17:19,348
solving things from a business standpoint and running P&Ls and all of those things.
258
00:17:19,408 --> 00:17:24,108
It was always tertiary to data and analytics, as every company I worked for
259
00:17:24,128 --> 00:17:28,068
was a software company that had data products or an analytics product.
260
00:17:28,228 --> 00:17:31,848
And so it was never too far, but definitely not on the technical side.
261
00:17:31,948 --> 00:17:35,288
And then it really wasn't until age started to take over and you start asking yourself
262
00:17:35,288 --> 00:17:39,848
the questions of like, am I really doing the things I love that I gravitated back towards data?
263
00:17:39,868 --> 00:17:44,408
It was right around when you and I met that I was getting my data science master's.
264
00:17:44,408 --> 00:17:49,448
Big old leap after that, which was, well, let's go see who's going to buy into
265
00:17:49,448 --> 00:17:51,668
the fact that I can run on that side fully.
266
00:17:51,668 --> 00:17:57,448
And luckily, a great startup, Outlier AI, did and let me spend a couple years
267
00:17:57,448 --> 00:18:04,108
immersed in the modern technologies and really hands-on stuff, spending my time preparing
268
00:18:04,108 --> 00:18:09,248
data from all different kinds of industries and companies for time series modeling.
269
00:18:10,048 --> 00:18:13,688
And I haven't really looked back since. That's my origin story.
270
00:18:14,028 --> 00:18:19,508
Such an interesting story. And I definitely want to dive into kind of your focus
271
00:18:19,508 --> 00:18:22,288
today and talking about data.
272
00:18:22,428 --> 00:18:27,408
But before we get into that, I'm just kind of curious, like what keeps you busy
273
00:18:27,408 --> 00:18:29,808
outside of the exciting world of data?
274
00:18:30,328 --> 00:18:34,328
Yes. So I am the mom to an 11-year-old boy.
275
00:18:34,488 --> 00:18:39,608
So that is the primary thing that keeps me busy.
276
00:18:39,608 --> 00:18:44,888
You know, as most parents know, your life revolves around things like common
277
00:18:44,888 --> 00:18:50,228
core math and mastering YouTube videos of common core math so that you can help
278
00:18:50,228 --> 00:18:51,568
with math worksheets and,
279
00:18:51,648 --> 00:18:56,088
you know, practicing spelling words of words that you thought you knew how to spell,
280
00:18:56,188 --> 00:18:58,368
but you look at them and go, maybe I don't know how to spell.
281
00:18:58,428 --> 00:19:04,428
How many M's are in commemorate? Right. So a lot of my time is taken up with those things.
282
00:19:04,528 --> 00:19:11,288
But when I am not basically revolving my life around him, I have two dogs, love them.
283
00:19:11,608 --> 00:19:15,848
My one dog's my jogging partner. I was going to say running partner,
284
00:19:15,968 --> 00:19:19,588
but I think my days of running are past. Definitely jogging territory now.
285
00:19:19,748 --> 00:19:25,348
I like to hike and I still love, as I have since probably the day I picked up
286
00:19:25,348 --> 00:19:27,528
my first book, Fantasy and Science Fiction.
287
00:19:27,608 --> 00:19:34,588
So read a lot of that. Well, read and listen to audiobooks because I could do that while I'm jogging.
288
00:19:35,888 --> 00:19:39,108
Combining multiple loves all at once, right? You got to be efficient, right?
289
00:19:39,768 --> 00:19:43,548
Well, being a busy mom, it's the efficiency, right? Right. I'm curious.
290
00:19:43,688 --> 00:19:48,908
I'm really into science fiction, mostly consuming it on television.
291
00:19:49,268 --> 00:19:52,488
But is there a science fiction book that you're reading right now that you would
292
00:19:52,488 --> 00:19:54,028
recommend or a recent one?
293
00:19:55,664 --> 00:19:59,204
So recommending, I would recommend everything I read. Generally,
294
00:19:59,264 --> 00:20:03,324
I've actually mastered the art of putting a book down that I don't like.
295
00:20:03,404 --> 00:20:06,584
That took me a long time, but I will. If I don't like it, I'll put it down.
296
00:20:06,664 --> 00:20:07,864
So anything I read, I would recommend.
297
00:20:08,284 --> 00:20:15,024
But I definitely have interesting tastes. So right now I'm reading the Crossbreed
298
00:20:15,024 --> 00:20:17,004
series from Dannika Dark.
299
00:20:17,224 --> 00:20:24,724
So I tend to gravitate towards strong female characters, usually in some type
300
00:20:24,724 --> 00:20:27,844
of urban fantasy type novels.
301
00:20:28,244 --> 00:20:31,944
So she's definitely somebody I've read a lot, and I think she does a good job.
302
00:20:32,264 --> 00:20:35,264
And I have to mention Shannon Mayer. I won't go on forever because,
303
00:20:35,404 --> 00:20:37,584
I mean, a list of authors would be hundreds long.
304
00:20:37,764 --> 00:20:40,704
But I think she would be a favorite of mine.
305
00:20:40,984 --> 00:20:44,204
And also, again, really strong female characters.
306
00:20:44,264 --> 00:20:47,624
But she does the Forty-Proof series, which is,
307
00:20:47,684 --> 00:20:53,104
you know, her heroine is, you know, past 40 and has to learn how to,
308
00:20:53,144 --> 00:21:00,064
like, be a hero and not be 20 and tough anymore. So maybe that's closer to my heart.
309
00:21:00,724 --> 00:21:05,444
That sounds fascinating. We'll have to check out the 40-plus heroine stories.
310
00:21:06,304 --> 00:21:12,384
It takes some inspiration, I think. But so, Rachel, you talked about kind of
311
00:21:12,384 --> 00:21:14,264
your journey and your career
312
00:21:14,264 --> 00:21:19,784
path and how you went from data to the business and then back to data.
313
00:21:19,924 --> 00:21:24,544
And in that transition, were there any surprises now that you're back on the data side?
314
00:21:24,544 --> 00:21:30,504
Yeah, I think that some of the biggest surprises that I had going from side
315
00:21:30,504 --> 00:21:35,224
to side like that is how much of a gulf there is,
316
00:21:35,404 --> 00:21:41,944
like how big the chasm is and how with all the maturing of business strategies
317
00:21:41,944 --> 00:21:49,404
and all of that, we really haven't made tremendous strides in closing that chasm so much.
318
00:21:49,404 --> 00:21:53,524
Or as much as I think maybe we have the opportunity to.
319
00:21:53,784 --> 00:21:58,224
Having worked on both sides, not only are the languages just,
320
00:21:58,324 --> 00:22:00,884
and I'm not talking about programming languages or anything,
321
00:22:00,964 --> 00:22:04,084
but the language you speak, the vernacular you speak, the concepts that you
322
00:22:04,084 --> 00:22:09,304
ideate on and really rally around are just very different.
323
00:22:09,384 --> 00:22:12,244
But the biases are super strong.
324
00:22:12,464 --> 00:22:17,344
You see it surface in memes and stuff on LinkedIn,
325
00:22:17,524 --> 00:22:19,164
and they all make us laugh.
326
00:22:19,404 --> 00:22:22,524
Right? So we know there's truth in them. Like where somebody's like,
327
00:22:22,604 --> 00:22:23,664
hey, can you throw that data together
328
00:22:24,320 --> 00:22:27,220
For me real quick? And, you know, stuff like that. And you're like, oh yeah,
329
00:22:27,220 --> 00:22:29,960
that's a week-long task that you just
330
00:22:29,960 --> 00:22:33,000
think is going to take five minutes. And vice versa, where
331
00:22:33,000 --> 00:22:35,940
you know, the business side is like, I really don't understand
332
00:22:35,940 --> 00:22:39,220
how it could take you that long to get to this number. There's at least been
333
00:22:39,220 --> 00:22:43,660
acceptance in the last, you know, half decade to decade that, like, there's
334
00:22:43,660 --> 00:22:48,580
no solve to this problem other than, like, we all work together. But I don't,
335
00:22:48,580 --> 00:22:52,260
I was just super surprised to see that we're still so far from that. Yeah, I mean,
336
00:22:52,260 --> 00:22:53,640
as you said, there's no simple
337
00:22:53,840 --> 00:22:58,180
solve to this problem besides collaboration and communication and openness.
338
00:22:58,480 --> 00:23:01,680
And I mean, without that, you might as well just pack up and go home.
339
00:23:01,720 --> 00:23:05,900
We spend all this time. And, by the way, I'm a fan of some of the things that I'm mentioning,
340
00:23:06,060 --> 00:23:09,480
like I'm a fan of thinking of things from conceptual or framework things like
341
00:23:09,480 --> 00:23:13,560
data mesh and stuff like that, and different ways that you can build out organizations
342
00:23:13,560 --> 00:23:16,740
and collaboration strategies and all this to aid in these things.
343
00:23:16,800 --> 00:23:21,040
But we spend so much time thinking about those things, conceiving them,
344
00:23:21,160 --> 00:23:23,560
articulating them, and breaking them down and stuff like that.
345
00:23:23,700 --> 00:23:27,600
And we kind of skip over the whole like work together part of it.
346
00:23:27,660 --> 00:23:33,800
But I feel like it's really weird because the business world is almost the antithesis of academia.
347
00:23:34,060 --> 00:23:38,340
But at the same time, we take these colossal academic approaches to like,
348
00:23:38,400 --> 00:23:42,440
let's define a whole new way of working as opposed to like, let's sit down and
349
00:23:42,440 --> 00:23:44,180
solve this problem together.
350
00:23:44,380 --> 00:23:48,520
Right. Which should be the new way of working, right? Let's collaborate and
351
00:23:48,520 --> 00:23:49,540
drive that forward.
352
00:23:49,660 --> 00:23:53,080
But I guess, you know, as you think about that chasm, right,
353
00:23:53,140 --> 00:23:57,920
what I've experienced, at least, is that the wider the
354
00:23:57,920 --> 00:24:02,580
chasm, the more friction between the data teams and the business.
355
00:24:02,980 --> 00:24:08,360
So how are you navigating that friction and that chasm in your daily life?
356
00:24:08,660 --> 00:24:14,780
Yeah, that's not a simple question at all, because that, you know,
357
00:24:14,780 --> 00:24:18,520
the chasm is real, and the vocabularies are not the same.
358
00:24:18,680 --> 00:24:23,720
And even the slant on the same vocabulary isn't the same.
359
00:24:23,880 --> 00:24:28,320
So, and I'll give you an example. Data trust is super important to data teams
360
00:24:28,320 --> 00:24:30,960
and data trust is super important to the business, right?
361
00:24:31,160 --> 00:24:34,740
But if you're on the data team side and you're talking about data trust,
362
00:24:34,820 --> 00:24:40,260
you're going to talk about things like fidelity to source or observability or
363
00:24:40,260 --> 00:24:45,340
visible lineage or complete lineage or complete and transferable metadata.
364
00:24:45,420 --> 00:24:47,020
And you're going to think about these things and you're going to think about
365
00:24:47,020 --> 00:24:55,340
the pieces and parts that give you trust that the data is accurate to your specifications. But
366
00:24:55,738 --> 00:24:59,638
You may be okay with things like 1% variances in certain things,
367
00:24:59,718 --> 00:25:00,558
right? And stuff like that.
368
00:25:00,678 --> 00:25:03,978
And you're going to maybe think of things like mathematically or statistically,
369
00:25:04,098 --> 00:25:06,258
but you're going to be okay with parameters like that.
370
00:25:06,358 --> 00:25:10,238
If you immerse yourself fully on the business side, you walk into a conversation
371
00:25:10,238 --> 00:25:11,678
that has similar things.
372
00:25:12,118 --> 00:25:15,758
You know, people are going to want things like clear definition of metrics,
373
00:25:15,978 --> 00:25:18,198
right? You know, how did you build this? What's the math?
374
00:25:18,378 --> 00:25:21,078
You know, show me the math, show me the path. Like, where did it come from?
375
00:25:21,158 --> 00:25:23,138
Show me, you know, how do you create that?
376
00:25:23,178 --> 00:25:27,678
And what's the math behind it? But I've seen whole conversations just completely
377
00:25:27,678 --> 00:25:32,898
implode when somebody's like, oh, it's within the tolerance, the error tolerance.
378
00:25:32,958 --> 00:25:34,218
And the business side is like, what?
379
00:25:34,438 --> 00:25:37,098
Like, what's missing? Why is it missing?
380
00:25:37,418 --> 00:25:40,598
Where is it going? And how do I know that it's not something really important
381
00:25:40,598 --> 00:25:42,938
that's missing? Now everything's suspect, right?
382
00:25:43,278 --> 00:25:47,438
And it's that difference in attitude that seems to cause a lot of challenges.
383
00:25:47,438 --> 00:25:52,178
And navigating that, I have found some success, or at least learned
384
00:25:52,178 --> 00:25:53,998
some things about trying to
385
00:25:53,998 --> 00:25:57,038
avoid different trigger words that come off as kind of flippant like that.
386
00:25:57,218 --> 00:26:01,458
It's like if there's a rattle in your engine and you're driving your car
387
00:26:01,458 --> 00:26:05,178
down the road and you have no idea how engines work, you're going to be pretty scared.
388
00:26:05,338 --> 00:26:07,758
But if you're a mechanic and you're like, that's the heat shield.
389
00:26:07,878 --> 00:26:08,818
It's like, I'm driving along
390
00:26:08,818 --> 00:26:10,978
with my stepdad. He's like, that's the heat shield. Don't worry about that.
391
00:26:11,038 --> 00:26:14,338
You're fine. And I'm like, yeah, really? Am I? Is it going to blow up?
392
00:26:14,598 --> 00:26:18,958
Because I don't feel fine. That's a weird noise. And those types of like really
393
00:26:18,958 --> 00:26:23,778
relatable things are what's happening in this type of dynamic between these teams.
394
00:26:23,798 --> 00:26:26,978
Again, to the thing that we seem to throw away super easily,
395
00:26:27,098 --> 00:26:31,498
which is the human connection of really trying to understand each other a little bit better.
396
00:26:31,618 --> 00:26:36,218
Yeah, I find that if you throw the jargon out the door and meet somebody where
397
00:26:36,218 --> 00:26:40,938
they're at and try to understand how do you think about this and what are the
398
00:26:40,938 --> 00:26:43,618
things that matter to you as an individual.
399
00:26:43,878 --> 00:26:48,218
It's hard as a group, but I think if that individual is a representative of
400
00:26:48,218 --> 00:26:51,638
the group, their peers, you can kind of try to get an understanding from at
401
00:26:51,638 --> 00:26:55,098
least that individual in terms of how they think, what is it that they're worried about?
402
00:26:55,318 --> 00:26:57,918
What is their tolerance level in business terms?
403
00:26:58,158 --> 00:27:02,278
And try to set that all aside ahead of time so that you kind of know where you're
404
00:27:02,278 --> 00:27:05,658
going to go as you engage with them and as you move forward.
405
00:27:05,878 --> 00:27:09,458
And I find that a lot of people miss that step. They go in and they immediately
406
00:27:09,458 --> 00:27:13,838
engage, assuming that they're on the same page, assuming that they understand
407
00:27:13,838 --> 00:27:15,058
what's important to one another.
408
00:27:15,218 --> 00:27:19,958
And that's a complete misstep for most people who are in this world.
409
00:27:20,018 --> 00:27:22,658
And that's why the chasm just gets bigger and wider.
410
00:27:23,096 --> 00:27:25,556
Because everyone's making assumptions about the other side of it.
411
00:27:25,636 --> 00:27:30,496
Yeah, the spectacular miscommunications that come from those assumptions.
412
00:27:30,636 --> 00:27:33,736
One of the biggest ones that I have seen, and again,
413
00:27:33,936 --> 00:27:39,376
you know my bias from being on the business side so much of my career and now
414
00:27:39,376 --> 00:27:41,176
being completely immersed in the data side,
415
00:27:41,296 --> 00:27:47,796
is the assumption that the business side somehow doesn't understand maybe the
416
00:27:47,796 --> 00:27:50,496
math of the situation as well.
417
00:27:50,496 --> 00:27:55,816
It's the rare individual on the business side that sits down and really likes
418
00:27:55,816 --> 00:27:59,036
to actually work out the math of a gradient regression.
419
00:27:59,716 --> 00:28:06,156
Those are things that maybe only a few people like to do. That doesn't mean that they can't.
420
00:28:06,696 --> 00:28:10,856
Once they sit down and talk to one another, they're surprised that there's people
421
00:28:10,856 --> 00:28:14,176
on the business side that can work out supply chain forecasts in their head
422
00:28:14,176 --> 00:28:19,176
and really understand very complicated concepts at a very detailed level.
423
00:28:19,176 --> 00:28:23,876
And there's a lot more commonality there than you think there is.
424
00:28:23,976 --> 00:28:26,796
It's just coming from different angles. Yeah, absolutely.
425
00:28:27,136 --> 00:28:32,616
It was funny because I actually met someone who once said that companies
426
00:28:32,616 --> 00:28:34,456
don't understand basic economics.
427
00:28:34,656 --> 00:28:39,036
And I'm sitting there going, wait, most people I meet who are on the business
428
00:28:39,036 --> 00:28:43,436
side either have economics background or an engineering background,
429
00:28:43,496 --> 00:28:47,956
like C-level executives I've met who have engineering backgrounds and they're business executives.
430
00:28:48,036 --> 00:28:51,516
And I'm sitting there going, do not dismiss people because of their title.
431
00:28:51,616 --> 00:28:53,396
They know more than people think.
432
00:28:53,596 --> 00:29:00,936
And I also think the reality is most of these concepts can be distilled down to very basic terms.
433
00:29:01,176 --> 00:29:06,296
Yeah. Yeah. And I think part of it is also around language, right? Language matters.
434
00:29:06,756 --> 00:29:12,556
And so one of the things that we've seen happening quite frequently is this movement of resources.
435
00:29:12,876 --> 00:29:18,936
So as one priority comes up, we're moving our people from, you know,
436
00:29:18,936 --> 00:29:23,276
from priority one to priority two, and they're taking the language and learnings
437
00:29:23,276 --> 00:29:27,236
that made them successful initially to this new role,
438
00:29:27,356 --> 00:29:28,936
new priority, new shiny object.
439
00:29:29,256 --> 00:29:31,176
So are you encountering that today?
440
00:29:31,736 --> 00:29:35,296
Yes. And I think that I would have answered
441
00:29:35,436 --> 00:29:40,796
that question a little bit differently before Gen AI became a thing.
442
00:29:40,876 --> 00:29:44,736
Now, it's something that keeps me awake at night. There's always been a bit
443
00:29:44,736 --> 00:29:46,536
of a whole, like, shiny new thing.
444
00:29:46,696 --> 00:29:50,036
And I think there's a couple of things that personally in my life have changed
445
00:29:50,036 --> 00:29:53,956
that maybe put me closer into a group that does that more,
446
00:29:54,076 --> 00:29:57,596
you know, which is the startup culture and Silicon Valley.
447
00:29:57,756 --> 00:30:01,616
There's things you can't ignore. People give money to shiny new things,
448
00:30:01,776 --> 00:30:05,376
you know, more than they give money to boring old things.
449
00:30:05,476 --> 00:30:09,136
And it is what it is. But there's a real cost to that.
450
00:30:09,416 --> 00:30:11,556
Context switching has a cost.
451
00:30:11,956 --> 00:30:16,956
It is a mental tax on the people who are context switching, and it has an efficiency
452
00:30:16,956 --> 00:30:19,456
cost and it has a tech debt cost.
453
00:30:19,910 --> 00:30:25,170
On things that are basically cut short so that you can move on to that new thing.
454
00:30:25,370 --> 00:30:31,050
And all those things need to be talked about and honored within the context
455
00:30:31,050 --> 00:30:36,270
of what we've been talking about, which is that gulf between the business side and the data side,
456
00:30:36,410 --> 00:30:42,890
making sure that we really use our voices and try to articulate the cost of
457
00:30:42,890 --> 00:30:47,670
making those moves and the risk and find the right words.
458
00:30:47,890 --> 00:30:53,390
I see it as like a tug of war Never give up. Just always keep pulling and don't
459
00:30:53,390 --> 00:30:54,850
lose your ground on those things.
460
00:30:55,030 --> 00:31:02,050
Because the cost of allowing that to happen, especially if you hold innate knowledge
461
00:31:02,050 --> 00:31:03,970
of what possibly can go wrong.
462
00:31:04,070 --> 00:31:08,770
And that's why I said Gen AI. I kind of hold a scar that I think every single
463
00:31:08,770 --> 00:31:13,670
one of us who works in data every single day watched this whole LLM thing unfold
464
00:31:13,670 --> 00:31:16,350
going, what about the data?
465
00:31:16,350 --> 00:31:21,650
And all of us, it was almost like watching a slow-moving accident unfold.
466
00:31:22,490 --> 00:31:27,050
We all knew, but there was so much excitement and so much momentum.
467
00:31:27,070 --> 00:31:31,230
And I feel personally, in my circle, we just never found the words.
468
00:31:31,310 --> 00:31:34,690
As we all have found out in business, "I told you so" never matters.
469
00:31:34,870 --> 00:31:38,070
By the time you get to "I told you so," you've lost the battle, right?
470
00:31:38,230 --> 00:31:43,070
If I look at that from a lessons learned standpoint of how we can learn to work
471
00:31:43,070 --> 00:31:47,210
together better, it's like finding those words to articulate and not bore the
472
00:31:47,210 --> 00:31:50,970
people who are excited. You move the resources to the shiny new thing.
473
00:31:51,110 --> 00:31:56,490
You might have a super cool prototype that is flashy and everybody loves.
474
00:31:56,850 --> 00:32:00,970
And if the purpose of that, if the objective of that was to get like an investment
475
00:32:00,970 --> 00:32:02,330
or something, that's fine.
476
00:32:02,610 --> 00:32:08,850
If your purpose was to productionalize, was to get a business goal and business
477
00:32:08,850 --> 00:32:14,730
value out of it, you moved resources from the thing that was building the underpinnings
478
00:32:14,730 --> 00:32:16,870
of it. And now you've made your path longer.
479
00:32:16,990 --> 00:32:20,810
And being able to have those discussions is super important.
480
00:32:21,170 --> 00:32:26,130
Yeah, I was just gonna say, I find that to be the case regardless of whether
481
00:32:26,130 --> 00:32:31,930
it's Gen AI or any other new tech that comes along, or concept or issue,
482
00:32:31,930 --> 00:32:33,770
that an organization is trying to address.
483
00:32:34,410 --> 00:32:39,710
Orgs still have not gotten smart to the fact that you still have to run your operations.
484
00:32:39,710 --> 00:32:45,210
And if you're going to do R&D innovation, that should be
485
00:32:45,210 --> 00:32:49,650
a completely different arm of your business. Even in a data team, it should be separate,
486
00:32:49,650 --> 00:32:53,090
because then you can make those calls and say, all right, we were going to innovate
487
00:32:53,090 --> 00:32:55,450
on X. Here was the investment we made.
488
00:32:55,865 --> 00:33:00,425
If we're going to shelve that, we already have a sunk cost on that investment with that team.
489
00:33:00,605 --> 00:33:04,665
Do we spin up a secondary team to run this secondary investment so that we don't
490
00:33:04,665 --> 00:33:08,045
lose sight of the sunk cost we already have, you know, proposed value?
491
00:33:08,305 --> 00:33:10,125
Get in there, iterate real fast.
492
00:33:10,505 --> 00:33:15,885
You do have to R&D it. There's a concept that I'm sort of obsessed with on the
493
00:33:15,885 --> 00:33:18,325
side. It's around metric trees.
494
00:33:18,685 --> 00:33:23,025
One of the things that I find very useful in a well-constructed metric tree
495
00:33:23,025 --> 00:33:25,905
is that idea of isolating the
496
00:33:25,905 --> 00:33:30,105
portions of the business and the proportionality of the business impact.
497
00:33:30,225 --> 00:33:34,105
You can isolate the business impact at the right level.
498
00:33:34,805 --> 00:33:38,465
You can't drink two consecutive cups of coffee without somebody asking you about
499
00:33:38,465 --> 00:33:42,545
the value you're driving or articulating the value you're driving or quantifying
500
00:33:42,545 --> 00:33:43,625
the value you're driving.
501
00:33:43,705 --> 00:33:48,825
Finding ways to make you more immune to being led off track into shiny things
502
00:33:48,825 --> 00:33:52,625
that might be super cool, but are going to take a lot of time and impact a very
503
00:33:52,625 --> 00:33:56,225
small amount of the business seems like a survival skill these days. Yeah.
504
00:33:56,385 --> 00:34:00,945
So you, in order to do that, you've been using this idea of creating kind of
505
00:34:00,945 --> 00:34:04,645
a visual metric tree to really quantify the impact of the ask.
506
00:34:05,165 --> 00:34:09,785
I actually just wrote down metric trees because my brain just started firing
507
00:34:09,785 --> 00:34:14,445
on all cylinders as you said that, because there's always a challenge in terms
508
00:34:14,445 --> 00:34:16,225
of being able to articulate ROI.
509
00:34:16,445 --> 00:34:20,365
I've done some workshops on this. And the first thing I always ask is,
510
00:34:20,365 --> 00:34:21,985
do you really understand what drives the business?
511
00:34:22,605 --> 00:34:26,645
Every ask out there has to help whatever drives the business.
512
00:34:26,805 --> 00:34:31,425
And if it doesn't naturally impact it and you're not able to quantify it, then you have no ROI.
513
00:34:32,090 --> 00:34:36,050
You can understand whether it influences some of that, but even the impact of
514
00:34:36,050 --> 00:34:37,670
the influence can be measured.
515
00:34:37,870 --> 00:34:41,490
I love that idea of metric trees, having that within an organization.
516
00:34:41,690 --> 00:34:46,450
And that's an opportunity for these data catalog companies to actually create
517
00:34:46,450 --> 00:34:47,450
it as part of the catalog.
518
00:34:47,690 --> 00:34:52,490
Data catalogs are actually pretty flat. There's no this metric impacts that
519
00:34:52,490 --> 00:34:55,310
metric impacts that definition, et cetera. Right.
520
00:34:55,430 --> 00:34:59,610
And it is that relationship that's key. And it's why I get fascinated by it
521
00:34:59,610 --> 00:35:01,830
and really excited by it.
522
00:35:02,090 --> 00:35:04,630
That you can create an ROI story about anything.
523
00:35:05,230 --> 00:35:07,950
You've got a whole concept of what are you trying to optimize?
524
00:35:08,110 --> 00:35:10,170
What are you trying to maximize? What are you trying to minimize?
525
00:35:10,470 --> 00:35:14,890
It's all interrelated and impacted by business strategy.
526
00:35:15,230 --> 00:35:17,910
You know, am I trying to, you know, increase my brand awareness?
527
00:35:18,110 --> 00:35:22,470
Am I trying to increase my low- or high-intent conversions?
528
00:35:22,610 --> 00:35:25,370
Like it's all in the context of business.
529
00:35:25,570 --> 00:35:28,990
And if you look at the metrics by, I like the word you used,
530
00:35:29,050 --> 00:35:35,810
flat, you've lost the context because you can be focusing on minimizing or maximizing
531
00:35:35,810 --> 00:35:40,670
a metric that actually is like three levels deep from a more meaningful metric.
532
00:35:40,750 --> 00:35:45,450
And you're focusing all your effort on, let's say, you know,
533
00:35:45,450 --> 00:35:48,850
minimizing something that's supposed to minimize costs and it's responsible
534
00:35:48,850 --> 00:35:51,630
for, you know, less than 5% of your overall cost.
535
00:35:51,730 --> 00:35:55,250
Until you map it out and really start to drive those relationships,
536
00:35:55,570 --> 00:35:59,030
you lose the context. This concept's been around for a long time,
537
00:35:59,070 --> 00:36:00,390
the whole balanced scorecard.
538
00:36:00,510 --> 00:36:04,550
Let's understand how our business actually runs and what are all the drivers
539
00:36:04,550 --> 00:36:05,670
to this part of our business.
540
00:36:05,870 --> 00:36:10,070
And that could lead to not only just for the sake of understanding the value
541
00:36:10,070 --> 00:36:12,950
of the data, the impact and the proportionality, as you said,
542
00:36:12,950 --> 00:36:18,270
of the value of those things, but also how we report on it and how we look at it.
543
00:36:18,350 --> 00:36:21,150
There's so much value in that and nobody does it.
544
00:36:21,210 --> 00:36:25,110
And then you can also almost circle around it because you have different parts
545
00:36:25,110 --> 00:36:27,550
of the organization that all feed into the same number.
546
00:36:27,630 --> 00:36:31,750
So you can kind of start looking at things differently in that respect as well.
547
00:36:31,910 --> 00:36:33,690
These parts of the business work in isolation.
548
00:36:34,170 --> 00:36:38,650
And then somebody at the top who needs data from all different parties is sitting
549
00:36:38,650 --> 00:36:42,070
there going, well, I don't understand why I can't get this number.
550
00:36:42,230 --> 00:36:46,490
Let's go look at the metric tree. Rather than me telling you how data comes into a warehouse,
551
00:36:46,610 --> 00:36:50,710
Now I'm going back to that metric tree and telling you it's because I have 18
552
00:36:50,710 --> 00:36:53,490
different metrics that impact that number, and I can't get six of them.
553
00:36:53,849 --> 00:36:58,009
Right. Yeah, I can't get six of those. Or you show somebody at that level that
554
00:36:58,009 --> 00:37:01,189
you're talking about, you know, you take a C-level person, they can then do
555
00:37:01,189 --> 00:37:03,289
that kind of mental math and be like, okay,
556
00:37:03,509 --> 00:37:08,649
there's some key areas of concern here in the pipes that are causing me to not
557
00:37:08,649 --> 00:37:09,669
be able to get this number.
558
00:37:09,789 --> 00:37:13,049
How does that help me contextualize my priorities? Exactly.
559
00:37:13,109 --> 00:37:17,509
Because we hear time and again, you know, I'm asked what the value of my data
560
00:37:17,509 --> 00:37:22,889
is, and organizations struggle to articulate that. So I think this is absolutely fascinating.
561
00:37:23,109 --> 00:37:28,509
I'd love to kind of pick at it. How do we, you know, roll that out to our teams
562
00:37:28,509 --> 00:37:32,349
to ensure that they are looking at developing those metrics,
563
00:37:32,589 --> 00:37:38,329
developing those metric trees, and then actually communicating along those lines? Yes.
564
00:37:38,609 --> 00:37:42,569
So I think I've hinted at it. It's kind of where I'd stake my,
565
00:37:42,569 --> 00:37:46,389
you know, my flag: on almost like back-to-the-basics type of stuff,
566
00:37:46,529 --> 00:37:51,229
which is dispense with a lot of the lofty ivory tower stuff and give people
567
00:37:51,229 --> 00:37:56,209
real projects with clear boundaries and clear objectives to work on together.
568
00:37:56,209 --> 00:37:59,909
What I'm really talking about is the fundamental team building and trust building,
569
00:38:00,049 --> 00:38:04,329
the type of synergies that come out of real teams solving real problems together.
570
00:38:04,629 --> 00:38:09,029
You sit down in a room together with a whiteboard and you spend a couple weeks
571
00:38:09,029 --> 00:38:10,429
really solving a tough problem.
572
00:38:10,609 --> 00:38:16,049
There's some real teamwork and trust and shared experiences that come out of
573
00:38:16,049 --> 00:38:17,749
that. They're going to share information.
574
00:38:18,049 --> 00:38:23,009
They're going to learn to trust one another in the situations where they don't
575
00:38:23,009 --> 00:38:23,929
understand one another.
576
00:38:24,109 --> 00:38:29,129
You need that because you can't take somebody who's entrenched on the data side
577
00:38:29,129 --> 00:38:30,529
and teach them everything about business.
578
00:38:30,989 --> 00:38:34,149
I'm sure, given enough time and their inclination, you could.
579
00:38:34,149 --> 00:38:37,569
You don't have that type of time and money in business and vice versa.
580
00:38:37,649 --> 00:38:41,189
You can't take somebody on the business side and you don't have the time to
581
00:38:41,189 --> 00:38:43,089
make them a fully skilled data practitioner.
582
00:38:43,289 --> 00:38:48,189
You have to learn to trust others when you don't understand them and those real
583
00:38:48,189 --> 00:38:49,609
world practicum
584
00:38:49,880 --> 00:38:55,080
Activities seem to be really key. And I would tell any data leader out there
585
00:38:55,080 --> 00:38:57,920
to find those opportunities and rally around them.
586
00:38:58,060 --> 00:39:04,000
Yeah. And I would add to that as well, making it okay to take a risk and potentially
587
00:39:04,000 --> 00:39:07,720
fail in that exercise. That's key. That's super key.
588
00:39:08,120 --> 00:39:14,040
But I heard the saying, you don't succeed or fail, you succeed or you learn.
589
00:39:14,240 --> 00:39:17,580
And I forget where it came from. I think it was either some commercial or some
590
00:39:17,580 --> 00:39:20,960
like athlete or something said it, but it really is true.
591
00:39:21,200 --> 00:39:24,680
If you look at it the right way, when you fail, that's it, you learn.
592
00:39:24,820 --> 00:39:28,700
And really engendering that within our teams is really core.
593
00:39:28,920 --> 00:39:31,760
Yeah, I think it was actually a Tiger Woods commercial where at the end,
594
00:39:31,780 --> 00:39:36,660
his dad says, and what did you learn from that?
595
00:39:37,560 --> 00:39:42,620
So I know we're coming up on time. So just one last question for you, Rachel.
596
00:39:43,000 --> 00:39:49,560
If you could have dinner with anybody from history and discuss the future
597
00:39:49,560 --> 00:39:52,700
of data, who would it be and what would you ask them?
598
00:39:52,880 --> 00:39:56,100
This is one of the hardest questions that anybody ever asked.
599
00:39:56,240 --> 00:39:59,260
I think this might be a little weird of an answer because it really,
600
00:39:59,360 --> 00:40:03,280
in my opinion, doesn't have anything to do with data, except for in a tangential way.
601
00:40:03,580 --> 00:40:09,200
But I would pick somebody like Sophie Germain or Nellie Bly.
602
00:40:09,460 --> 00:40:13,100
There are people in history, little known, that I'm usually obsessed with.
603
00:40:13,180 --> 00:40:16,220
Obviously, I'm a woman, and so I'm like, oh, what about the woman's
604
00:40:16,220 --> 00:40:18,160
voice, the lost women's voices over time?
605
00:40:18,320 --> 00:40:23,320
I would want to sit down with them, and I wouldn't ask them anything about data.
606
00:40:23,400 --> 00:40:28,500
I mean, I would ask them, what do you do when it gets too hard?
607
00:40:28,880 --> 00:40:34,220
Because being in a technology space as it is, is tough.
608
00:40:34,360 --> 00:40:37,220
That's tough. Being in a space that is dominated
609
00:40:37,889 --> 00:40:42,269
By people other than women. And this isn't just, you know, me speaking women-centric.
610
00:40:42,329 --> 00:40:45,329
This has to do with a lot of people who are underrepresented in this field.
611
00:40:45,809 --> 00:40:51,049
We've talked about so many tough concepts and challenges about getting people
612
00:40:51,049 --> 00:40:54,589
to work together and challenges about getting ideas across and all those things,
613
00:40:54,689 --> 00:40:55,989
right? They're all tough as it is.
614
00:40:56,089 --> 00:40:59,669
I think I mentioned earlier, I see a lot of things like a tug of war where if
615
00:40:59,669 --> 00:41:05,909
you let up for a second, you've lost ground that you now have to spend capital to make up.
616
00:41:06,029 --> 00:41:11,329
And there are days when you look around, you're like, do I have it in me to
617
00:41:11,329 --> 00:41:13,669
try to find the words that I can't find?
618
00:41:13,989 --> 00:41:18,949
Do I have it in me to try to really dig to understand that other position?
619
00:41:19,189 --> 00:41:22,589
Because I know that's what's getting in the way of this. And you look back and
620
00:41:22,589 --> 00:41:26,249
there's just these amazing people that did amazing things.
621
00:41:26,409 --> 00:41:30,669
Sophie Germain, she had no support, no representation, no role models.
622
00:41:30,669 --> 00:41:36,829
She completely dominated a mathematical space, basically was able to bring to
623
00:41:36,829 --> 00:41:39,569
fruition a math theorem that many tried and failed to prove.
624
00:41:39,569 --> 00:41:43,849
Nellie Bly picked one of the most underrepresented groups of the time,
625
00:41:43,889 --> 00:41:48,509
like mental patients, knew they would be treated unfairly. She was a journalist.
626
00:41:48,749 --> 00:41:52,669
She could have lived her life quite happily, I'm sure, reporting on things other than that.
627
00:41:52,829 --> 00:41:57,569
She went undercover in a mental hospital where she knew people were being treated
628
00:41:57,569 --> 00:42:03,389
abhorrently and used that to drive amazing and lasting change in the world.
629
00:42:03,489 --> 00:42:05,769
Those people could really tell you what was tough.
630
00:42:05,969 --> 00:42:09,509
I'm not minimizing anybody's struggles, but I'm curious, what did they think
631
00:42:09,509 --> 00:42:13,609
about when the days were hard? Or did they just not think about it? Was that their secret?
632
00:42:13,809 --> 00:42:17,209
I want to know that. And it might not be the best dinner conversation.
633
00:42:17,629 --> 00:42:21,249
But I would love to have that conversation.
634
00:42:21,669 --> 00:42:25,569
I think that'd be a fascinating conversation. I love those figures.
635
00:42:25,749 --> 00:42:27,969
I love their stories and what they
636
00:42:27,969 --> 00:42:34,469
overcame to really drive that long-term improvement in people's lives.
637
00:42:34,469 --> 00:42:40,209
And so much of what we see in data isn't the technology, but it's the people,
638
00:42:40,309 --> 00:42:43,289
the people surrounding the data and the people working with the data.
639
00:42:43,289 --> 00:42:46,409
Look, the data is going to change. It's going to get bigger.
640
00:42:46,489 --> 00:42:47,309
It's going to get different.
641
00:42:47,429 --> 00:42:50,189
It's going to have more privacy rules. It's going to have less privacy rules.
642
00:42:50,649 --> 00:42:55,429
But the people are going to stay, you know, the consistent part,
643
00:42:55,589 --> 00:43:01,949
you know, up until the generative AI LLMs are, you know, embedded in the cyborgs
644
00:43:01,949 --> 00:43:06,349
and those take over, you know, next week. Then we'll all be talking about it. Exactly.
645
00:43:08,249 --> 00:43:12,369
Well, Rachel, thank you so much for coming on the podcast with us.
646
00:43:12,429 --> 00:43:14,349
We really appreciate it. And thank you both.
647
00:43:14,509 --> 00:43:18,209
I really appreciated the opportunity. And this was by far the best.
648
00:43:18,640 --> 00:43:34,817
Music.