Deep Dive Into Experimentation With Lea Samrani (ex Bumble, Badoo, Uptime)

Lea: If you look at a product today that has what I would call a shitty experience, really, that has, you know, a really poor UI, like a lot of different UIs. It's just very inconsistent. It's hard to use, there's stuff popping in here and there.

Automatically, as an end user, you lose your trust in that product. You think this is a bit spammy. This looks like, you know, the internet of the 2000s. Like, this is not something I'm okay with today. And I think you're a lot less likely to actually put your money into this product.

Christian: The discipline of design
is now key to building great products.

More and more companies are making
space for it at the higher levels.

More people than ever
want to become designers.

And most of us who already do the job want to find ways to have just a little bit more impact in our teams.

Welcome to Design Meets Business.

I'm Christian Vasile and on this
podcast, I bring you world class

product and design leaders who found
ways to shape products, companies,

and entire industries, and who are now
sharing what they know with you and me.

My hope is that we all get to learn from the experiences, ideas, and stories shared on this podcast, and, in the process, become better designers.

Today, I'm chatting with Lea Samrani, who's one of the best, if not the best, product manager I've ever worked with.

Since we've worked together, Lea
has become a consultant and is doing

great work for her clients, helping
them scale and grow their products.

Today, we're talking about how design can
build better relationships with product,

what kind of designers collaborate well with product people, and we go super deep into talking about experimentation.

I hope you'll enjoy this one.

Lea, very excited for our chat today.

Welcome to Design Meets Business.

If someone is looking over your
CV, they'll see some cool companies

there, Badoo, Bumble, Uptime, and
more recently, a lot of different

startups that you're advising.

But if they look even further,
they'll see some great results

you've helped these companies with.

Whether that's increasing conversion rates, or increasing MRR, or helping them remove other barriers to growth.

And I know you're a big fan of experimentation, moving fast, and taking calculated risks.

Just the type of product person
that I love partnering with.

And I'd like to dig deep today
and see what designers could learn

from you to replicate some of these
successes that you've had with the

teams you've been working with.

But before we begin and go into
all of that, please tell us

a little bit about yourself.

Lea: Hi, Christian.

Thank you so much for having me.

I'm very excited to be here today.

What a nice intro as well.

So yeah, my name is Lea.

I've been in product for
about 13, 14 years now.

I actually started my career in marketing and then moved over to product when I realized you can get results much faster.

Clearly I'm not a very patient person.

And then from there I moved into
experimentation really quickly and

then into a bit of growth as well.

So it's been very interesting
over the last few years.

I've worked with a lot of different apps: lifestyle, dating, health, fintech, healthtech as well. Very different categories, but all of those products have something in common: they're always trying to build a great experience for the end user.

And my job has been to find what this
great experience is for the end user,

which will also make the company money
and grow and be sustainable over time.

So that's what I've been
doing for the last few months.

It's very interesting.

I learned a lot.

It's very exciting.

And I hope I can keep doing that for years to come.

Christian: Yeah, for sure.

Recently, you've moved from more
permanent roles to advising, right?

And you're working with a lot
of different companies now.

How is that?

How has that changed your daily life
versus just being on one company?

Lea: It's very different.

Being in house, you're very
focused on one problem.

You spend your entire time thinking about that one problem. You manage a lot of internal stakeholders.

You have to play the internal games as
well, the politics and all of that stuff.

Being external, none of that
matters, so it's way better for me.

However, you need to be very good
at context switching, because

there's a lot of context switching.

You still need to be able to dive very deep into a subject, even though you don't have the same level of information as you would have if you were in house.

There's a bit of balance as well,
because very often as an advisor,

you're responsible for the strategy,
but you're not actually executing.

And we all know very well that anyone can have an idea, but the execution is actually what makes a difference. So you don't have that much control over the execution part of things, which tends to be very important for the end result.

So it comes with a very different set of challenges than being in house, but you do get to meet a lot of people from very different backgrounds. You get to work on completely different kinds of products, different sizes, different problems.

I've learned so much about things I knew nothing about. Like, recently I helped a product that was growing fish on Mars. What do I know about growing fish on Mars? It's just a very wide range of products, and it's amazing. I feel so... what's the word? I feel so lucky that I get to experience that.

Christian: I think one of the aspects of my career that allowed me to evolve faster

is that I've done something similar where
I just consulted for a lot of different

companies versus just being in house.

And I think, with similar experience there, you learn a lot about a lot of different things. Not only that, you also learn a lot about how to handle different types of stakeholders in different companies.

What's the difference between a B2B sales-driven company versus a consumer app?

And I think learning all of these puts more tools in your tool belt. You said that as a consultant it's harder to control the implementation and the execution, which, as you also said, is probably the part that matters the most. So how do you then keep track of the execution and try, at least, to control or shape that in one way or another when you're not there every single day?

Lea: So depending on the company size and the people that are working on something, it's very different. If you have people that you work directly with who are going to do the execution, it's a bit easier. You can follow up with them a bit more closely. And one of the ways to do that is just being very specific about what it is we're trying to do.

You know, I'm not coming up with big plans, long-term roadmap ideas.

It's very specific.

This is what we're going
to be doing next week.

And this is what we're going
to be doing this month.

You break that down into smaller chunks of work that are easier to take into execution.

And if you work closely with those
people, you can give feedback on a regular

basis and be much closer to the project.

Now, when the company is bigger or maybe more spread out, and there are more stakeholders, it gets way more challenging, because you can only affect a certain area of it. So in that case, in my experience, it's really more around process, where you try to shape a bit more how people are working with each other and how the company processes are enabling you to apply that strategy, rather than the actual execution of the strategy you're suggesting.

Christian: So most of the people listening to this are designers, and most of them are early-ish in their career, or around mid level, senior level. With that in mind, not all of them have had a chance to work with product people before, if they work in super lean startups. And even those who had the chance to work with product people perhaps haven't necessarily worked with a good product person.

So let's start from that baseline. If you're a designer out there wondering what a good product person does, help us understand: what's that ideal person like?

Lea: Product is a very interesting term, because in different companies it means something very different. I had the luxury of working at some top-of-the-market companies where product is very well done.

So I have seen firsthand very good product people.

And for me, what sets them apart is that they had a very good understanding of the strategy from the business perspective, and a very strong understanding of the customer, the end user, the person that will be using the product, with a lot of empathy for those people. But they were also very practical people. So they had that capacity of taking all that uncertainty and all that knowledge and turning it into very small, applicable decisions.

The second thing they were really good at was ensuring that everybody in their team understands why we're doing what we're doing, and that allows them to create a product mindset across the entire team. And so suddenly you're no longer the only product person in that team. You're not the product manager; you are the one shaping the product mindset. So the entire team becomes a product manager.

And I think we see that more
and more recently, like you

yourself, you're not a designer,
you're a product designer, right?

Product designers, frankly, can eventually replace product managers, and they should, because it's almost the same set of skills with added hands-on design experience. So that's a really great product manager as I've seen it. Now, in most companies, that's not the case. In most companies, product managers are project managers: they do a lot of admin, they just move things around.

They don't get to really
decide what they're working on.

It's very often stakeholder-led or sales-led or finance-led, depending on what company you work with. And then it's more about you trying to push that vision to life, or quite often just trying to keep your team happy, or explaining why your stakeholders have decided something when you don't really know yourself why.

You're just kind of that middle-management person, and this is probably the worst application of product. I think that's what gives product managers a bad name. You're just a product manager by name in this case, but not really in function.

Christian: It's hard sometimes to sit in one of those teams and have a product manager who's actually just a project manager, right? As a designer, you think a lot of the value that comes from a product person is around the business strategy and the things you've already mentioned earlier, versus a project manager who just makes sure that everything gets delivered on time.

So what that does is force you as a designer to borrow some of those skills that the product person would otherwise bring, and do them yourself.

And we talk a lot about
being more business savvy as

designers on this podcast.

And I guess for me, anytime I look at
a good product person, for me, they're

always just very business savvy.

And if we're able as designers to learn those skills, I don't think it's a matter of replacing, as you said earlier. But if you work in a smaller company that doesn't have a lot of resources, and has a design team and an engineering team but no resources for a product team, then I think as a designer you can fulfill some of those duties yourself, because as you said, they're very complementary skills.

So, one of the things that I believe in is that in this triad of engineering, product, and design, one of the aspects that sets good teams apart from lesser teams is the relationships they're able to build with each other.
they're able to build with each other.

I think that the strong relationship
between design and engineering, strong

relationship between design and product,
those oftentimes push teams further than

if those relationships are not good.

So I wanted to talk to you, having worked with a lot of designers in the past, in different teams and different sizes of companies.

How do you, as a product designer, build good relationships with your product counterparts?

Lea: It's always about understanding what the other person's role and responsibilities are. What are their problems in the day-to-day of the job?

What are the challenges,
where do you need support?

If you can really put yourself in
somebody's shoes, you can build

that relationship with them.

And for me, I always start with why. It really is: why are we doing what we're doing? Who are we doing it for?

Are we aligned on that?

That's the first step.

If we all, the three of us, and actually I would argue the four of us, because I do believe analytics has its place there as well, if the four of us are aligned on why we're doing what we're doing and who we're doing it for, then the rest kind of flows.

You all have different skills and different backgrounds, which allows each of you to see something from a different perspective. But you do that with the same objective, for the same end user, trying to solve the same problem and understanding why you're doing it. That allows you to have the same foundation. And the second you have the same foundation, everything else naturally flows.

Then there's the respect that comes with the job; there's mutual respect. You can't, or at least I can't, go to a designer or an engineer and say: why did you do this thing this way? Do it that way.
Instead, there's respect.

I have respect for the
skills, for the experience.

So if you're questioning something, it's more around: take me through that path. Tell me why you decided to do something a certain way, so I can understand your point of view, rather than essentially saying what you did is a bit icky, let's do it my way.

And then there's just the human side to it. Like, how do you build relationships in your life? There's just basic stuff, right? Be kind to people, talk to them, try to have a decent, normal human relationship. If it's outside of work, sometimes it's even better. That used to be a bit easier before. Right now, you have to make a bit more of an effort to try and talk to someone outside of work.

But those are just basic human things. At the end of the day, when you get along with someone, it's also easier to work together, and you don't have to agree.

Frankly, I rarely agree with
my engineering counterpart

or my designer counterpart.

We very often disagree, if
anything, and that's very healthy.

I think it's very healthy because that's
how you get the best of each other.

But the disagreement is
always done within respect of

understanding each other's skills.

And it's always within that foundation where we're all on the exact same page about why it is we're doing what we're doing and who we're trying to do it for. So the disagreement is on the what, and that's okay.

Christian: You mentioned conflict there, or not conflict, but disagreements rather. And it's something that probably every designer who's ever worked with a product person has had.
worked with a product person has had.

You bring a design forward, you
show something that you've put your

heart and soul in, and a PM looks
at it and thinks, this is not great.

And if you have a good product person, they might approach it a bit differently.

They might ask questions rather
than just tell you their opinion.

Oftentimes what happens is
that someone comes in and

gives their subjective opinion.

Oh, I don't like this because I
don't like pink or whatever, right?

As a designer, that's really hard to deal with, because it's a personal opinion, right? You don't have any data to back that up with. You're just like, I just don't like it because I don't like it.

So in that case, I think disagreements
can sometimes lead to a bit of animosity

and a bit of conflict in the team.

How do you encourage designers to deal with product people who give them feedback based on personal opinions?

Is there any way to push back?

Is there any way to ask further
questions to understand where

that feedback is coming from?

What do you think designers
should do in that case?

Lea: Yeah, absolutely.

It's always the same, right?

It's always about understanding why
you're saying what you're saying.

If a product manager is giving feedback on design, it might be because they've actually done a lot of user research and they understand the problem better than the product designer, which, frankly, would be a bit worrying to start with.

It could also be because they had that
vision in their head and that vision

is just something they're struggling
to get away from because that's just

what they see and what they think.

So in that case, it's a bit harder, but you can, again, try to understand what it is they're trying to achieve with that vision.

So it's not necessarily saying why
would you put, you know, I don't

know, that CTA on that page, but more
around what are you trying to achieve

by putting that CTA on that page?

What do you think that
will enable the user to do?

How do you think somebody is
going to feel when they see that?

Or what action do you expect
somebody to take when they see that?

And I think that allows you to understand a bit more what they're trying to achieve, and then find some sort of way to deliver that, which can be a mixed vision, it can be yours, it can be theirs.

It doesn't matter that much.

Sometimes it will change.

But it's really trying to understand that.

And very often we come at it
from a different perspective.

When you look at something from a design perspective, you do want it to look good, right? To look nice. It needs to be beautiful. It's a bit of art, right?

From a product perspective, I don't think you come at it from that art view; it's way more around the practicality of it.

I want to know if something's going
to be easy to develop, to release.

I want to know if something
has been tested before.

So we have a bit more certainty around it, rather than something brand new that's a bit more risky, when maybe it's not really the right time to take that level of risk.

I want to know if this is going to
solve the problem that we're trying

to solve in the most efficient way.

And if that's not the case, then there's definitely going to be some sort of discussion there.

And the last bit is around the overall look and feel. I mean, that should be fully owned by design, really, but there are some scenarios where the product manager is going to provide feedback on that. And that could be either because the designer is maybe a bit junior and so needs that kind of support, or because the product manager is just too opinionated.

On the one hand, that's good: the person can be a bit junior, can use a bit of feedback and develop. But I do get concerned when your product manager knows more about design, or has a better eye, I would say, than your product designer. That's so rare; if that's the case, I'd get concerned.

And on the other hand, it's just about setting a healthy boundary from design to product, which again really goes back to: what is it you're trying to achieve? What do you expect the person to do there? How do you expect them to feel? And then go back to your work and see if it achieves that.

Christian: You said something
there that you've brushed past, but

I'd like to bring it back because
I think that's very important.

You mentioned a couple of things that
a product person might care about.

Is this going to solve the problem?

Is there any way you've
tried to mitigate risk here?

All of these things.

And it goes back to that idea
of you need to understand what

the other person cares about.

And you as a designer, when you're
looking at a design, you poured

your heart and soul into it.

You care about how it looks. Especially people earlier in their careers care more about how it looks than about the business part of it.

Because frankly, design
education doesn't teach you that.

So that comes with experience.

So if you're earlier on, you're much more likely to just put something forward because you like how it looks, until you understand what other people care about. And that's not only product people, by the way; that's also engineers. They have their own things they care about.

If you come up with this fancy,
crazy animation that looks good in

Figma or whatever, and then you show
that to an engineer and he or she

thinks this is going to take me days.

Can we have something simpler?

It's not because they don't like it.

It's because they care about being able to
ship fast and they care about not having

to write a lot of code that will create
tech debt and all of these other things.

And it's the same with product.

They care about: does this solve the problem? Is there a risk here, and have we mitigated it?

So I think you should present work from the perspective of what this person cares about, so that what you talk about when you present the work covers what they care about. If they care about "does this solve the problem?", then in your presentation of the work, talk about how you think this solves the problem.

If we present work like that, I think that can also resolve, or pre-resolve ahead of time, some of these issues that might otherwise arise.

I'm going to ask you a question that
you might or might not have an opinion

on, but what is an ideal design
partner for you as a product person?

Lea: So the designers I got along with best, where I think our work was probably some of my proudest work with the best results, were people that were very communicative. So very early on, they asked for feedback. We had a lot of back and forth; she had that Slack channel that just never stopped beeping.

Oh, I had an idea.

I thought about that.

What'd you think of that?

And this constant back and forth.

I've heard a couple of times that this is designing by committee, and I really disagree with that. For me, this relationship is so important, and bringing that feedback in so early on allows you to avoid the big clash at the end. It completely avoids the "actually, you didn't really get my vision," or "we went in a different direction," or having to justify why you're doing what you're doing, just because you have that relationship early on.
So, having that person that's very communicative and is actually happy to get feedback early on and share their ideas.

And it's not pixel perfect.

If we're talking about an idea, I'm
not going to look at the details.

Like I know not to look at the details
because you told me it's just an idea.

And so I really like that.

I think that really gets us to a place where at the end of the work, I'm always like, wow, this is amazing.

I also have a preference for product
designers that understand data.

It is sometimes a little bit challenging when you work with product designers that care a lot about something. And I get why they care about it, but that thing they care about will be seen by two people. As a product person, or at least as a business person, the work we're trying to do needs to impact a lot of people, a lot of users.

Unless, you know, we're that huge team. Perfect, then I can spend time on that tiny bit of the flow.

That's great.

But often that's not the case.

So if you're going to spend a lot
of time on something that's going to

affect two or three people, this is
something that I'm going to question.

But if you're someone that understands the data and understands how your work affects the end users, that's never really a question, because from a product design perspective, you'd come and say: actually, does that really matter?

Like only two people
are going to see that.

Can we move on?

So I think that's quite interesting.

And then the last bit
for me is around copy.

That's a whole other subject, really, but in most organizations, I think what you see is that the design gets done and the copy is almost just placeholder. Then it goes to a copy person who does the copy, and the end result doesn't really make a lot of sense. It's a bit patched up.

So there are ways of working, like content design, where the copy is integrated very early on. But you can also have a product designer that's thinking about what messages we're trying to communicate there, how we should communicate them, and what's the best form to do so when they're designing something. So we never really end up with, I don't know, spaces that don't really make sense or steps that don't make sense.

And I guess that can be brought back to a bit of the psychology of what we're doing, who we're doing it for, and why we're doing it, right?

And not just thinking, actually, I have to design three screens, but: what is the intent of my screen? What is it trying to achieve, and what is the best way for me to communicate that, both visually and verbally?

So yeah, those are the three skills. Over-communicating, with feedback and relationships early on. Data awareness, and if you don't know what the data are, or how to get them, or what they mean, that's what the product manager is there for as well, or your product analyst; it's something you can easily go and ask about. And the last one is really understanding what we're trying to achieve, not just with the design, as in: how is my work communicating to the end user, and how is my work solving the end user's problems, integrating copy as well as visuals?

Christian: I'll pull on two threads there.

One of them is about understanding
analytics and the fact that if you

don't understand analytics, that's fine.

But there are people
in the company who do.

And then it's just as simple as going
and saying, Hey, can you explain this?

I'm looking at these graphs.

I've been looking at them for 10
minutes and I understand nothing.

Is there any way you
can explain this to me?

What I've learned, and I remember I was
at British Gas back then, and we had

this very complicated analytics platform
and I was swimming in the dark a lot.

And I went to our analytics
team and I said, can you just

please explain this to me?

And what happened is that they realized that someone in the team was actually interested in their work.

They ended up for weeks, just bringing
analytics to me because they finally

thought, okay, we have someone in the
design team, a partner in the design team

who understands the value of what we do.

So then again, that's also going back to
building relationships is being interested

in what the other person is doing as well.

So that's one of the threads I think is worth highlighting again. The other one: I am a big believer in the fact that designers should write copy. I will never move away from that.

With that being said, you
don't necessarily need to

write absolute perfect copy.

It doesn't always need to make perfect
grammatical sense and all of that.

But if as a designer, you write
your copy as you design, then it

will feel like a better experience.

And then what you can do is bring over a copywriter, or someone who can just polish that for you. But I always find it strange when you design something and put lorem ipsum in it. The content is part of the design; the design without the content doesn't make sense. So for me, you have to write your copy, and if you need someone to check it afterwards, to make sure the tone of voice is right, whatever, fine. But at least the direction of the copy, I think it's important to write that yourself.

There's another thing that we haven't talked a lot about, but you are a very big fan of experimentation. I'd like to pivot a little bit into talking about experimentation and your background in it. And then, how can designers fit into that framework, or into that way of working, a little bit better than perhaps they do today?

So let's talk experimentation.

What are your thoughts there and
what sort of successes have you

had experimenting in the past?

Lea: I love experimentation.

That's not a secret.

I learned experimentation from people that were really good at it, where the philosophy was just: test everything, always. We had the luxury of hundreds of thousands of users, so it was very easy to test. We had a setup for it, we had an analytics team, we had good data. It wasn't perfect by any means, but it was a very good setup and a really good structure to learn how to do it well and to understand the power of it.

And I've been involved now in multiple projects where we tried to bring experimentation to a company that hadn't adopted it yet.

And I'm always amazed at some of
the early results you get when you

start experimenting with a company.

Because it's not just about: let's A/B test.

It's really about changing people's mentality into thinking that everything we do is a hypothesis, and nothing is just settled.

And when you start thinking that way, when you start thinking everything we're doing is a hypothesis that we need to validate or invalidate, it completely changes the way you make decisions. It changes the way you think about your work. It tends to make you go a bit faster, at least I believe, because if you do it well, you start thinking: what's the smallest way for me to validate that?

And it's just validation, and validation doesn't necessarily need to be an A/B test. It's better if you can, but some companies don't have enough users, or they don't have enough engineers to build it. And then validation can happen before the product, or you can even just fake it with copy or with ads or anything else you can do before the product.

But the mindset becomes: everything we do is a hypothesis that we need to validate in some shape or another.

And then you end up setting your team up in a way where they work on much smaller timeframes. Every week or every other week you start releasing, pushing new things live, and you start judging your work through a different lens.

So it's not: oh yeah, we finished that project on time, we're happy, we built that big new feature, let's launch it.

It's really about how many
people have adopted it.

Is there a repeat usage of it?

Are people more retained once
they go through that feature?

What are the feedback on it?

Can we also get user feedback on
it, like qualitative information?

That's also part of experimentation.

And then, if you have the luxury of big data, it's absolutely fascinating to see how often something that's being requested, or that you thought was so important, is actually not received like that by users at all. And I've seen that so many times. I've seen the most requested thing end up being adopted by 2 percent of users. And those users actually were not more engaged or more retained than other users.

I've also seen things that are very
controversial from a business point

of view or from a team perspective.

Something that we never wanted to do, or thought was so annoying, or sometimes even stupid, frankly, and actually when you launch it, the results are there and it just works.

So big data is very powerful, but
also understanding why things are

moving a certain way is important.

So experimentation for me is
one way of validating things.

It's not the only way.

You get your big data, you understand how something has affected your user behavior. But then you need to qualify it. Why has it affected them that way? What is it that happened? Qualitative data there is also very powerful.

Christian: If there's someone listening who hasn't been in a team where experimentation was an important way of doing product development, how would you explain to them why experimentation is important, versus the example you gave earlier: we just designed and built this great feature, let's just launch it.

Why is experimentation an
approach that's worth considering

versus launching big things?

Lea: Because 90 percent of the time you're wrong. It's as simple as that: you're wrong.

I'm wrong.

We're all wrong.

People are not all the same.

The ones that are vocal have different behavior than the ones that aren't vocal. The people we acquire from different marketing channels, you know, the different algorithms that target people right now, will bring us different cohorts of users with different behaviors as well.

There's a level of complexity
there that's way too high for the

human mind to even comprehend.

And that's what experimentation allows you to do: it allows you to validate whether what we're doing is actually aligned with our user base and/or our business targets, really.

And again, 90 percent of the
time it's going to be wrong.

It's a big celebration when a test
works because it's not that often.

And if your tests are working really often, it's because you're really early on in that journey, and you're so far from being an optimized product that pretty much everything you do works.

So I've seen that as well, when teams have really good hypotheses, a really good understanding of user problems, a nice discovery tree. They put up a hypothesis, good design, go and put it to the test, and it works, let's say 70 percent of the time, right? That's because the product is really not optimized. It has good market fit, it's a good product, but it's really not optimized, so your tests work more often.

The better your product becomes, the less likely your tests are to work. And if you were not to test, you could think about it as: in 90 percent of cases, we are going to make the wrong decision.

Now, one of the problems that comes with testing, and that's something I've seen quite a lot, is people thinking: I built this test, it's kind of an MVP version of something, it's launched, it worked kind of okay, let's move on to the next test.

And that's wrong as well, because experimentation is a way to validate a direction; it's not an end result. So that feature you just launched, or that user journey, or that tiny validation, is telling you to go in a direction. It's not the end result, and it's not something you can just leave there and move on from, because if you do that, you find yourself with one of those products that have tons of features, but all of them are half-baked and none of them provide a really good user experience.

And that's also very important. We've seen a lot of pushback lately, I think, around the whole concept of MVP, even around: does an MVP really work?

If something is not well done
or well thought through, can you

even use it to measure results?

And for me, an MVP is just a way of validating that you're going in the right direction. And sometimes just feature adoption can be a way of knowing you're in the right direction.

People want that.

Now let's think about what that
feature should actually look like.

And then let's test the first version,
the second version, et cetera, et cetera.

And so we go into the details of that user flow. But that's not something I've seen a lot, because teams are pressured for delivery, for pace, because there's always a lot of ideas, a lot of projects, a lot of OKRs.

I haven't often seen teams that have that kind of space where you can think: here's a hypothesis. Let me validate it first outside of the product, talk to users, or do a copy test or an A/B test on email, whatever.

Okay, this is a good direction.

Let me now build a validation
inside the product.

This is my MVP.

It's a user flow.

It's not a feature.

It's an actual user journey,
but maybe a simplified journey.

We test it.

Okay.

It has good numbers. What "good" means, as well, is something you need to set before you start testing. But it has good results: we're seeing, I don't know, high adoption, higher retention rates, higher engagement or monetization, whatever it is we're testing for.

Now, there's a step that comes
there again, which is how do

we evolve that feature now?

How do we evolve that user journey
into something that's actually the end

product or some sort of an end product?

And that step is usually
missing with experimentation.

And that's one of the reasons, I believe, why teams push back against experimentation: because they see it as this feature factory of half-baked MVPs that quickly go live and then disappear, and nothing else happens.

Christian: There's another component there that I've seen a lot of times, which is that it's very hard to hit the balance between building an MVP that's going to validate your hypothesis versus building an MVP just because we want to move fast.

So I think oftentimes we want to move fast, and we perhaps don't understand what an MVP really is. We have a hypothesis, we look at it, and we think: okay, here's what we can build. And the fact of the matter is that what you're building is not an MVP; it's just an MP, a minimum product, but it's not necessarily viable.

To me, that part in the middle, the viability of an experiment, is really important, because if you half-bake your MVP and put it out there, people get the feature in their hands and think it's absolutely crap, because you haven't executed it well enough. Then they're going to drop off; they're not going to adopt that feature. You're going to look at the analytics and think: oh, it didn't work very well.
Let's go into a different direction.

When in fact, the problem might
be the fact that your MVP was

just not well enough put together.

I've dealt with that so often.

How do you ensure there's a balance between how fast we move and making sure it's a viable experiment that we're doing?

Lea: Yeah, you nailed it there, right? It has to be viable. I've even heard "valuable" as a new definition of MVP, minimum valuable product, which I think is quite interesting.

I'm sure you've seen that GIF, I think it's been around forever, about the car MVP, which is like the bicycle instead of the... what is it? Two wheels.

Christian: Two wheels, yeah. That's it. That's not an MVP.

We'll put that in the show notes
for people who haven't seen it yet.

Lea: Yeah.

The application of that into
product is much harder, obviously.

Sometimes MVP is not something
you can do in five days.

Sometimes it's going to
still take weeks to do.

Sometimes MVP is not even
something on the product.

It's a design prototype that
you can test with people.

It's very hard to have a definition
because it depends on so many factors.

For me, MVP is sometimes about how you build something rather than what you build.

So you were talking earlier about a beautiful animation.

Maybe that doesn't make
it into the first version.

If there's like backend infrastructure
needed, maybe that doesn't make it into

the first version, maybe we fake stuff.

If there's a lot of logic,
again, can we fake some of it?

And then, if it works,
then build the logic.

So it's a way of making our work a
bit faster and a bit simpler while

achieving the same end results.

Because me, as a user looking at it, I won't know that you faked that logic. It will look real to me. I will only know you faked it if I use the product a lot, for a really long time, and I see that it actually doesn't change.

But my perception is that there's
an actual logic built into it and

you will see from my reaction, from
my usage of the product, if this is

something that's worth pursuing or not.

But it's not a problem that's easy to solve, because it's a very common problem. I agree with you; that's something we see a lot. And one of the reasons I think we see it a lot is because we don't understand what success means when we start designing something, when we start building something, even before, when we set a hypothesis.

What would success look like?

And that's something that we need to
think about before we build it, because

it's going to tell us if what we're
building is actually worth building.

Is it going to answer a question, or is it actually just going to open more questions? Because we have no way of understanding if that's a good thing or a bad thing. You know, you learn something and increase something by 2 percent.
Is that good or bad?

does that mean you should
spend more time on it or not?

Have you achieved the maximum you could
achieve there, or is that just showing

you that direction is worth pursuing and
potentially you can increase it by 50%?

And this comes from setting a very clear understanding of what success looks like, from the hypothesis stage. And as a product designer, if we go back to that, it's something you need to understand as well to do your job. How can you design something, how can you create something, if you don't understand what success looks like?

How are we going to measure that?

It's part of that thinking as well.

That needs to come into the role early on.
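To make "what does success look like" concrete before a test starts, here's a minimal sketch of the kind of back-of-the-envelope power calculation a team might run. This is not from the conversation, just a standard normal-approximation formula; the function name and the numbers are illustrative.

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(baseline_rate: float, relative_lift: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Rough users needed per variant to detect a relative lift in a
    conversion rate with a two-sided test (normal approximation)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_beta = NormalDist().inv_cdf(power)           # desired power
    pooled = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * pooled * (1 - pooled))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return ceil(n)

# A 5% baseline conversion with a 10% relative lift (5% -> 5.5%)
# needs roughly 31,000 users per variant, which is exactly why
# "we increased it by 2 percent" means nothing without this context.
print(sample_size_per_variant(0.05, 0.10))
```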

Christian: Yeah. And I think it's the responsibility of a designer to never start working on something until you understand what that something actually needs to solve. I mean, sure, if you just need to change a color here or there, that's not what I'm talking about.

But when it comes to hypotheses, to bigger features, to bigger pieces of work, I think it's very important; it's your responsibility to understand. If you don't understand what problem you're actually solving, it is then your responsibility to ask your product manager, or any product or business counterpart: what problem am I really solving here with this design? Before you actually go and open Figma. That's part of your discovery; it's an absolute requirement.

So talking about running some of these
experiments, how do you come up with

all of these ideas and all of these
hypotheses for what experiments to run?

Lea: Okay, so assuming you already have a product that's existing and live, and you have some level of data, I love looking at that data first to understand where it is we should be focusing.

So it's essentially kind of a funnel, right? It's a funnel, and I'm looking at where users are dropping. What's the stage they're not really getting past? And it's not necessarily that one step in activation; it can be anywhere in the product, or any kind of behavior. But I'd like to understand which part of the product is used the most and which part is not, or how long someone is using it for, so we can draw a little bit of a journey and understand where we should be focusing, because it will affect the majority of people. In a lot of products that tends to be very early on; once a product is a bit more mature, it tends to be much later on, and the later it is, the harder it is to get results, because it takes longer to get there.
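As a rough illustration of the funnel reading Lea describes, here's a minimal sketch; the event log, step names, and column names are made up for the example.

```python
import pandas as pd

# Hypothetical event log: one row per user per funnel step reached.
events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3, 3, 3, 3, 4],
    "step": ["install", "register", "activate",
             "install", "register",
             "install", "register", "activate", "pay",
             "install"],
})

funnel_order = ["install", "register", "activate", "pay"]

# Unique users reaching each step, in funnel order.
users_per_step = (events.groupby("step")["user_id"].nunique()
                        .reindex(funnel_order, fill_value=0))

# Step-to-step conversion; the biggest drop is where to focus first.
step_conversion = users_per_step / users_per_step.shift(1)

print(users_per_step)
print(step_conversion.round(2))
```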

But assuming we have this fresh product and we realize our day-one retention is really low, or conversion to payer is really low, we're going to try to understand why.

So we're going to look at this data to see: are people registering? Are they dropping during activation? Are they reaching our aha moment? Do we understand what the aha moment actually is? Are there many of them?

There's the whole concept where actually it's not just one action you need to take to get value from a product. It's a bunch of actions, a bunch of repeated activity.

So let's try to understand that first and get that from the big data. Now that we understand that, I want to talk to people to understand why things are the way they are.

And there, there are two schools. One school is: let's not talk about the product at all. Let's do more jobs-to-be-done interviews. Let's talk to people about what matters to them. What are their pulls and pushes, and what drives their day-to-day decision making? What scares them? What are their anxieties, and what are the current behaviors that might need changing? And this really allows us to get a bit of the psychology of the users.

The second is way closer to the product, where you actually do proper user testing of your product and see how people are interacting with it, much closer to the thing you're building, and what's happening there. The majority of the time, there's huge incomprehension. That's where we start.

People don't actually understand what your product does, what to tap, where to click; they don't understand any of it. There's just this very confusing experience, a lack of consistency between what they've seen, maybe on the ad that brought them there, versus what they're experiencing in the product, and generally a lack of understanding about what's in it for them.

And then, on top of that, we ask them for a lot of stuff. We ask them: give me notification permissions, give me location permissions, give me your money, give me whatever. We ask a lot of commitment from them, but we're not giving them a lot, and all of that put together makes them drop off. Plus a lot of other things, like life: I don't have a lot of time, I have other products, I have other things to do, et cetera.

So then you go and talk to people, and you put two and two together and come up with a bunch of hypotheses around what you could be testing. And those hypotheses don't need to be small hypotheses. They can be big things. It can be, you know: we believe that if we build a new product, we will actually solve that problem.

And once you have those high-level hypotheses, you go in and think about how you can validate them. That essentially leaves you with a list of ideas you can go and test. And you can classify them; there are different ways of classifying them. People like to do high impact, low effort first. I think that's arguable, but anyway, you classify them, and then at that stage, again, you need to set what success is going to look like, and then you go on and you do it.
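For the classification step, here's a minimal sketch of the impact-over-effort heuristic Lea mentions (and calls arguable); the backlog items and scores are invented for the example.

```python
# Hypothetical hypothesis backlog with rough 1-10 scores.
hypotheses = [
    {"name": "simplify registration", "impact": 8, "effort": 3},
    {"name": "pause membership",      "impact": 6, "effort": 5},
    {"name": "new onboarding video",  "impact": 4, "effort": 2},
]

# Highest impact-per-effort first; success criteria still have to be
# set per test before anything ships.
for h in sorted(hypotheses, key=lambda h: h["impact"] / h["effort"],
                reverse=True):
    print(f'{h["name"]}: score = {h["impact"] / h["effort"]:.1f}')
```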

Christian: What you said there, I think, is worth highlighting again, which is: you don't come up with ideas for experiments. You come up with hypotheses first, based on some sort of research you're doing, or an understanding of customers from discussing with them, or perhaps even based on data. And then the experiment is what you come up with to validate or invalidate that hypothesis.

It's not like you come up with ideas of: oh, we should put a button here. Because... because what, right? The hypothesis is that if we give people, I don't know, a way of pausing their membership rather than cancelling it, they're more likely to take that, and therefore we won't have to go after them to convince them to resubscribe, or something like that. The idea for the experiment comes because you have the hypothesis; it doesn't come first. So I like that, and I just wanted to highlight it again.

Last question I have on experimentation.

You might join a company, and they don't necessarily have the mindset of experimentation. They don't understand what it is. They don't understand its value. They think it's too risky.

They just want to put big
features out there and then

just run the company like that.

And perhaps there's a place and a time for that, but you might come in and say: at the point the company is at right now, experimentation is what we need. How do you then convince people who are risk averse and just don't understand that world?

How do you then at least start
convincing them and start showing

them the way that experimentation is
the way to build a better product?

Lea: I love that concept, because for me, experimentation is for people that are risk averse, because it validates what we're doing. If I were a big risk taker, I would have this big crazy idea and just go and build it and launch it and see what happens.

But I'm not, so I'm building
it in a way where I'm going to

validate it with experimentation.

So it's way safer from a business
perspective than the other way around.

So for me, it's really interesting that that's the perception.

Christian: I think it's the way you frame it, right? Because some people might say: oh, if we put a lot of MVPs out there, a lot of experiments, that might scare people away, might drive people away.

But I like the way you framed
it, which is, it's actually less

risky than doing bigger bets.

Lea: Yeah, absolutely. Plus, experimentation allows you to have things behind a feature flag and turn them off if they don't work. It's way less risky in practice.
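As a sketch of what "behind a feature flag" can mean in practice, here's a minimal deterministic bucketing function with a kill switch. The flag store, names, and rollout numbers are illustrative; in a real setup the config would live in a remote service so it can be flipped without a deploy.

```python
import hashlib

# Hypothetical in-memory flag config.
FLAGS = {"new_paywall": {"enabled": True, "rollout_pct": 50}}

def variant(flag_name: str, user_id: str) -> str:
    """Deterministically bucket a user into 'control' or 'test'.
    Hashing keeps the assignment stable across sessions; setting
    enabled=False is the kill switch that reverts everyone."""
    flag = FLAGS[flag_name]
    if not flag["enabled"]:
        return "control"
    digest = hashlib.sha256(f"{flag_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return "test" if bucket < flag["rollout_pct"] else "control"

print(variant("new_paywall", "user-42"))
```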

How do you convince them? There are different ways of going at it. One way is to look at what other people are doing.

Pick a bunch of very successful companies out there. The big majority of them are experimenting. Actually, I'm not sure there's a really successful company out there that's not experimenting. The vast majority are, and they're doing it on MVPs and non-MVPs as well.

When they're releasing a big new feature, a big new experience, they're doing it in a way where they're measuring the results, and usually you do that by split testing, or by releasing to only one country or one part of the population first. Especially if you have network effects, or anything like social media or dating, you need even longer to get results. So those companies tend to release to a percentage of the population for a very long time.

So that's one way of going about it: find a bunch of successful companies, put it together, show examples. That tends to work quite well.

Secondly, look at what you've done in the past. Even without experimentation, there's always a level of understanding of things we may have built, released, and failed with. You can go back and see that we spent eight months building whatever, and actually only 1 percent of people are using it, or whoever is using it shows no difference in retention, monetization, whatever metrics you care about, versus other cohorts of people. That assumes you have decent data, which often those companies don't have, but let's assume you do.

So you can go back and literally say: actually, if we had experimented with that, we would have saved the company X amount of time and X amount of money, and you can quantify that. You can quantify it with people's time, with investment, with revenue, with actual growth. Putting things into numbers really helps, especially depending on who your stakeholders are and how they think.

The third part is convincing the team to do it as a test. You experiment with experimentation, if you will. And for that, you need to use possibly the simplest tool out there, because we don't want a big engineering integration. Maybe we already have some things used for releases that we could use to split test, or maybe we just do it outside of the product first, to see if it works; ads, for example, are much easier to test because they come with the tooling.

And that first test, it's very important that it has results. It can be really good results or really bad results, that doesn't matter, but it needs to have a very strong result.

So you can actually show the value of it, because with experimentation, when you do it a lot, you'll see that a lot of the time you don't really get any result at all, right? There's no significant difference. And usually that means there's no impact.

And then it's up to you to decide if
that's a good thing or a bad thing

and then what's the decision there.

If your first test has no impact, it's going to be really hard to convince your stakeholders that you need to invest the time and effort to do this, when it actually shows no impact.

So that first test you're doing, make sure it's something that's going to be strong. Something very top-of-funnel that a lot of people will go through, so you get those numbers really quickly. The change shouldn't be tiny; it should be big enough, not so big that you can't understand where the change comes from, but big enough that it will have an impact.

And ideally, we want the impact to be on something very tangible. So I love to do monetization tests, because you get really quick results and you can essentially say: we make twice as much money if we do this versus that. And we only know that because we did a split test.
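Here's a minimal sketch of how a split test like that gets read. This is a standard two-proportion z-test, not anything specific Lea described, and the traffic and conversion numbers are invented.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference in conversion between two
    variants of a split test (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, p_value

# Variant B converts 6% vs A's 5% on 10,000 users each.
p_a, p_b, p = two_proportion_z_test(500, 10_000, 600, 10_000)
print(f"A={p_a:.1%}  B={p_b:.1%}  p-value={p:.4f}")  # p ~ 0.002
```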

And I think those are the three things you can do that work quite well, at least from what I've seen. Take external examples of very successful companies and how they work. Look at your own ways of working in the past, and tangibly measure how much you've lost by not experimenting, or by committing to a decision that retrospectively, you know, was the wrong decision. And the third thing: validate the experimentation framework itself by doing a mini test, out of the product, on something small enough that it can be built quickly, but big enough that it will have a very tangible impact.

Christian: A topic I want to cover is perhaps slightly different from experimentation, because when people think experimentation, they oftentimes also think optimization. Which is changing small things here and there: copy on a button, a color here, a color there, whatever it may be.

But I know you also believe that bigger changes, or rather changes in UI, are also valuable for the business, not only small optimizations here and there.

And we talk a lot about the value of design from a business perspective. We can move metrics, we can reduce costs, we can do all of these things through design. But the UI side of it, which is something we don't often see as having business impact, you actually believe that UI side of design can also have a business impact.
Let's talk a little bit about that.

Lea: Yes.

I don't know if that's controversial or not, because it's more of a soft metric to measure.

It's not as tangible to measure, but I believe we're in a world where things are more and more competitive, where the threshold to entry is higher than it's ever been in tech, and where things that were acceptable a few years ago are no longer acceptable.

And that means that the general population
is now used to a certain standard.

And I actually find that a bit funny, because it comes from private industry. We're used to a certain standard from private industry; when you look at the public sector, it's very different, but that's a topic for another day.

That means there's a certain level of things you have to be doing before we even talk about what your product is solving, how good your product is, et cetera. There's just a minimum standard of entry, and that includes things like bugs, product stability, product speed, and I believe it also includes the UI, the general look and feel of your product.

I believe consistency in UI and quality of UI are directly linked with trust; it's how you create trust in your brand, in your product.

And trust is a necessary step for people to come and use your product, to pay for your services.

Let's say your UI is a bit more
maybe lifestyle driven, like

Airbnb, for example, or Apple.

I do think it has an impact
on the end user as well.

I think it has an impact on how you
can price your product, on how you can

position it, on who you attract, on
how people talk about your product,

so on virality of your product.

It's harder to measure, because UI is usually something that, when you change it in a product, you can't necessarily change as a test.

And it's not just within the product.

There's a lot of brand that comes
with it, and a lot of product
perception that comes with it.

But it's easier to measure
the other way around.

If you look at a product today that
has what I would call a shitty
experience, really, that has a really
poor UI, like a lot of different UI,
it's just very inconsistent.

It's hard to use.

There's stuff popping up here and there.

Automatically, as an end user, you
lose your trust with that product.

You think this is a bit spammy.

This looks like, you know,
the internet of the 2000s.

Like, this is not something
I'm okay with today.

And I think you're a lot less likely to
actually put your money into this product.

And that's something that we did
test, and that we can test, because
we see it in products, especially
in the B2B world, in SaaS
products. You really see that, right?

Like the ones that are doing really
well right now, the Miros of this
world, they have really good UX and
they have really good UI as well.

Christian: I was thinking as you
were speaking, I love my analogies,
so I wanted to come up with an
analogy for this. Say you
are looking to buy a car.

And two people, two different people
are selling exactly the same car.

And you just go to see both of them.

And one of them is spotless.

It's shiny.

It's just been washed.

It's clean.

It's just in perfect condition.

And the other car, exactly the
same car, hasn't been washed.

It's dirty.

It has flies all over it, the
windshield and all of that.

I think naturally you're going to
think that's a less valuable car

because it's not taken care of as much.

I think it's slightly similar. You open
up an app and, whether consciously
(us product people probably do this a
bit more consciously, while non-product
people do it subconsciously), you look
at it and there's just something that
feels good about a product that is
crafted well, that is well thought out,
where the UI and the details have been
implemented well. When you open an app
like that, it sometimes makes you want
to give them your money just because
it seems like a quality product.

And if they're putting so much
care into the UI, you also think they're
probably putting a lot of care
into whatever is behind it.

Versus, it's like getting on a plane
and the plane is dirty.

We think: what else is not
working on this plane?

Are they also not taking care
of the engines or the landing
gear or whatever it may be?

Lea, we've been here for an hour,
roughly, and the tradition that I have
on the podcast is that at the end, I ask
every person the exact same two questions.

I'm going to change the second
one a little bit for you

because you're not a designer.

That's fine.

But the first one is the same.

What is one action that
you think led to your success,
that perhaps in one way or another
separated you from some of your peers?

Lea: Throughout my career, I've tried
to actively learn from other people.

I was living in London, which is
obviously a place where there's a lot
happening tech-wise, so I went to
every single event out there.

I'm still going to a lot of events.

I read a lot of blogs; back then
there was a lot of long-form content.
Maybe now it's a bit more podcasts
and videos, but back then it was
more long-form content.

So I really actively tried to
learn and absorb as much as
I could from other places.

And then I would come back to my
workplace with some sort of plan for
how we could action some of that; I
would always make something with it.

I remember the first time I came
across the whole North Star concept.
I was at a product conference
in Lisbon many years ago.

And it wasn't necessarily the first
time I came across it, but it was

the first time that I thought it was
explained in a way where it made sense

to me and I could see the value in it.

And so I came back to work and started
thinking: how can we apply that?

How can we shape that in a way
that would help our business grow?

In a way, doing that, I think, enabled me
to be, maybe not ahead of the trend, but
keeping up with the trend, and to constantly
bring my place of work into that state
of constant innovation, new ways of working,
and adopting these things as much as possible.

Looking back at it now, I had no
understanding of that back then. The
first place I worked in product
wasn't a big company.

It wasn't a big name.

And I didn't necessarily think
that we were doing things well.

I didn't necessarily think that
we were great at best practice.

And then, looking back after
everything I've experienced:
we were actually pretty good.

And the main reason for that
was that me and my team, everybody
there, were actively learning from
everything we were exposed to and
trying to bring it back to the business.

And we had the space within the business
to change things as well; we were
empowered to actually put those things
into place, change things, and take
those risks. And that shaped my career.

I think that accelerated my career a lot.

And that allowed me to be where I am
today. I'm still learning a lot, of
course, and I think I will be learning
till the end of my days, but I'm also
in a place now where I can teach and
pass on my knowledge to the next generation.

And that's pretty cool.

That's a nice circle-of-life situation.

Christian: I love that advice,
because I think it's so overwhelming today.

There are so many places you can
learn from: podcasts, books, courses,
documentaries, conferences, roundtables.

There's just so much content out there.

And I think consuming that content is
great, but consuming it with the purpose

of applying what you're learning at work,
I think that's really where the key is.

So thank you for that.

The last question is, what are
we not talking enough about

when it comes to products?

Lea: Sometimes I like to take a bit of
a step back and think about the products
we work on and the impact they have on
the world, on people. It's not something
we do very often, because, at least for
me, I'm very into the day-to-day of things.

We go deep into this problem and all those
metrics and KPIs and things we need to
change; we go very deep into all the
mathematics of things, or maybe the
pixels in your case. And we think about
the user as, I don't want to say a number,
because we talk a lot about the psychology
of the person, how we can impact that
person's life, and how we can fit into
their life.

So it's not really a number, but
we think about them as something
we need to get onto our product.

It's a measure of success, right?

The more users you have on your
product, the more engaged they are,
the longer they're there, the better
your product is doing. It's a measure
of how good your product is.

And sometimes, I think, when you step
back and you look at the impact of your
product on the world, it's astonishing.

And it's not something I've had the
occasion to think about very often.

But when I do, I'm always amazed at
the fact that tech now has an impact
on the world that goes way beyond
our own numbers and KPIs and metrics.

We have an actual impact on the world.

We're shaping the world of tomorrow.

And that's very scary because there's
very little regulation in what we do.

And I can give you a very
tangible example of that.

I worked on a product that
had 600 million people on it.

600 million people using a product.

So when you make a decision, when you
change an algorithm, you're actually
impacting 600 million people's lives.

I've worked on products that have a
huge health impact where the product

actively prevented people from
committing suicide, for example.

I've worked on products
that shape people's careers.

So the entire workforce of one
generation has somehow been impacted
by the decisions that we made: how
are we going to present the jobs?

How are we going to build recommendations?

How are we going to help
someone build a career?

And a lot of it is algorithm-based,
UX-based, tiny day-to-day decisions.

But if you look at it, if you step
back from it, it actually impacts
people's lives in a way that will have
a butterfly effect on the entire world.

And in many ways, I think working in tech,
working on product the way we do, is
extremely powerful, and with that comes a
certain level of responsibility as well.

And that's not something we talk
about very often, because I don't even
know that we realize how impactful
our work really is on the world.

Christian: Thank you.

What a great note to finish on.

I really love that answer.

If people want to find out more about
you, keep up with all these places, the
conferences you're participating in, all
the podcasts you're doing, and everything
else, where should they follow you?

Lea: LinkedIn.

LinkedIn is the place where I am
the most active. You can also
find me at many conferences.
Christian: Perfect.

Lea, thank you very much for being
part of Design Meets Business.

Lea: Thank you.

Christian: If you've
listened this far, thank you.

I appreciate you and I hope you've
learned something that makes you just

a little bit better than yesterday.

You can check out the show
notes on designmeetsbusiness.co.

If this has taught you anything,
please consider leaving a review

and sharing the episode with someone
else who could learn from it.

And I'll catch you in the next one.
