CRUZ: Are you familiar with the report that was released yesterday from Veritas, which included a whistleblower from within Google, videos from a senior executive at Google, and documents that are reportedly internal PowerPoint documents from Google?
GOOGLE: Yes, I heard about that report in the news.
CRUZ: Have you seen the report?
GOOGLE: No I did not.
CRUZ: So you didn't review the report to prepare for this
meeting?
GOOGLE: It's been a busy day, and I have a day job, which is Digital Well-being at Google, so I'm trying to make sure…
CRUZ: Well I'm sorry that this meeting is impinging on your day
job.
GOOGLE: It's a great opportunity thank you.
CRUZ: One of the things in that report, and I would recommend that people interested in political bias at Google watch the entire report and judge for themselves, is a video of a woman named Jen Gennai. It's a secret video that was recorded. Jen Gennai, as I understand it, is the head of Responsible Innovation for Google. Are you familiar with Miss Gennai?
GOOGLE: I work in user experience, and I believe that is somebody who works on AI principles in the AI group. But it's a big company and I don't work directly with Jen.
CRUZ: Do you know her or no?
GOOGLE: I do not know Jen.
CRUZ: As I understand it, she is shown in the video saying, and this is a quote,
"Elizabeth Warren is saying that we should
break up Google. And like I love her, but she is very misguided.
Like that will not make it better. It will make it worse.
Because all these smaller companies who don't have the same
resources that we do, will be charged with preventing the next
Trump situation. It's like a small company cannot do that." Do
you think its Google's job to quote, "prevent the next Trump
situation?"
GOOGLE: Thank you senator. I don't agree with that. No sir.
CRUZ: So a different individual, a whistleblower identified simply as an insider at Google with knowledge of the algorithm, was quoted in the same report as saying Google is, quote,
"bent
on never letting someone like Donald Trump come to power again."
You think it's Google's job to make sure, quote, "somebody like Donald Trump never comes to power again"?
GOOGLE: No, sir, I don't think that is Google's job, and we build for everyone, including every single religious belief, every single demographic, every single region, and certainly every single political affiliation.
CRUZ: Well I have to say that certainly doesn't appear to be the
case. Of the senior executives at Google, do you know a single
one that voted for Donald Trump?
GOOGLE: Thank you, Senator. I'm a user experience director and I work on Google Digital Well-being. I can tell you we have diverse use…
CRUZ: Did you know of anyone that voted for Trump?
GOOGLE: I definitely know of people that voted for Trump.
CRUZ: Of the senior executives at Google?
GOOGLE: I don't talk politics with my workmates.
CRUZ: Is that a no?
GOOGLE: Sorry, is that a no to what?
CRUZ: Do you know any senior executive, even a single senior executive at the company, that voted for Donald Trump?
GOOGLE: As the Digital Well-being expert, I don't think this is in my purview to comment… I definitely don't know…
…
CRUZ: Let's talk about one of the PowerPoints that was leaked.
The Veritas report has Google internally saying "I propose we
make machine learning intentionally human centered and intervene
for fairness." Is this document accurate?
GOOGLE: Thank you, sir. I don't know about this document, so I don't know.
CRUZ: Okay, I'm going to ask you to respond to the committee in writing afterwards as to whether this PowerPoint and the other documents included in the Veritas report are accurate. And I recognize that your lawyers may want to write an explanation; you're welcome to write all the explanation that you want, but I also want a simple, clear answer: is this an accurate document that was generated by Google? Do you agree with the sentiment expressed in this document?
GOOGLE: No sir I do not.
CRUZ: I'm going to read you another. Also in this report, it indicates that, according to this whistleblower, Google deliberately shifts recommendations: if someone is searching for conservative commentators, instead of recommending other conservative commentators it recommends organizations like CNN or MSNBC or left-leaning political outlets. Is that occurring?
GOOGLE: Thank you, sir. I can't comment on search algorithms or recommendations given my purview as Digital Well-being lead. I can take that back to my team, though.
CRUZ: So is it part of Digital Well-being for search recommendations to reflect where the user wants to go rather than deliberately shifting where they want to go?
GOOGLE: As a user experience professional, we focus on delivering on user goals. So we try to get out of the way and focus on the task at hand.
CRUZ: So, a final question. One of these documents that was leaked explains what Google is doing, and it has a series of steps: training data are collected and classified, algorithms are programmed, media are filtered, ranked, and aggregated, and it ends with, people, parenthesis, like us, are programmed. Does Google view its job as programming people with search results?
GOOGLE: Thank you, Senator. I can't speak for the entire company, but I can tell you that we make sure we put our users first in design.
CRUZ: Well, I think these documents raise very serious questions about political bias.