This event happened on Feb 23rd, 2023.
In this edition of LisbonSEOMeetup we debated the impact AI search/chatbots may have on SEO, both in terms of how they may make searchers change their behaviour, how companies create content, and how we SEO professionals will have to adapt to this potentially disruptive technology.
Video recording: AI search and SEO: Will ChatGPT, Sydney, Bard and “their friends” steal our jobs?
Here is the video recording of the meetup, in case you missed it or in case you want to watch it again:
Video transcript
Here is the video transcript of the meetup (it was automatically generated, so it may contain errors).
00:00:00.000 –> 00:00:00.990
It’s from today.
00:00:02.650 –> 00:00:12.900
Ana Verissimo: Okay, we have the recording. Just one month from today we have another meetup, just so everyone knows. It’s gonna be hybrid, so, for those in Lisbon, you can join us
00:00:13.170 –> 00:00:16.850
Ana Verissimo: in Selina. It’s going to be about auditing,
00:00:17.020 –> 00:00:20.540
Ana Verissimo: and it’s going to be Monday presenting.
00:00:20.780 –> 00:00:31.640
Ana Verissimo: So you can also join online. The information is going to be on the website; we’re going to email you. But just a heads-up: it’s exactly one month from today, at the same time. If you can, join us.
00:00:33.090 –> 00:00:39.590
Diogo: Perfect. So I’m just gonna read. We just…
00:00:39.920 –> 00:00:59.550
Diogo: we just talked about this meetup, and I had some notes I prepared for one of my podcasts, so I’m gonna read those notes, which talk a little bit about the new Bing that has
00:00:59.550 –> 00:01:09.480
Diogo: ChatGPT in it, and also some of the information we already know about Bard. Or Bert? Right?
00:01:09.860 –> 00:01:15.300
Diogo: Yes, I mean, yeah, it’s Bard, right? Please, someone correct me.
00:01:15.370 –> 00:01:19.110
Nair Dos Santos: Bard.
00:01:22.420 –> 00:01:51.570
Diogo: Anyway. So the idea is that I’m just gonna read this text through. It was originally in Portuguese, so I had ChatGPT translate it, just for fun, so let’s see what comes out of this. And just to say, this text was written 2 weeks ago, so there might be new information. So please correct me, or let’s talk about it later, after
00:01:51.590 –> 00:01:55.250
Diogo: I go through the text. Does anyone want to
00:01:55.580 –> 00:01:58.270
Diogo: add anything before I start reading?
00:01:58.460 –> 00:01:59.280
Diogo: No.
00:01:59.730 –> 00:02:07.860
Diogo: Okay. So this is just a little bit of a summary about the new Bing and Bard.
00:02:07.920 –> 00:02:18.110
Diogo: There we go, now I have the text in front of me, and I’ll start with Bing. So, “the new Bing”. That’s how Microsoft decided to call it.
00:02:18.110 –> 00:02:45.830
Diogo: And it’s funny, it was a conscious decision not to create a new brand or a new product, and that’s interesting. According to the CEO of Microsoft, which I don’t have the name here, it was quite clear for them that they just wanted to do a facelift to Bing. I guess they don’t have anything to lose, right?
00:02:45.860 –> 00:02:53.780
So, according to Microsoft, they have been working on the system for 3 years
00:02:53.780 –> 00:03:13.650
Diogo: already, which is a really good indication of how challenging this system can be. Although they’ve been working on and with ChatGPT for 3 years, apparently something did happen in August
00:03:13.650 –> 00:03:30.570
Diogo: last year, and that was when they specifically decided to transform Bing. So that’s also interesting, and also a good idea for us to have of how long it can take for Google, for example, or how
00:03:30.570 –> 00:03:35.130
Diogo: delayed Google is on this. Maybe, we don’t know, right?
00:03:35.360 –> 00:03:56.270
Diogo: So, more: this new Bing actually uses an improved version of GPT-3.5, which is better than ChatGPT, according to Microsoft. And something funny that they also asked the Microsoft CEO was whether this was the
00:03:56.310 –> 00:04:15.070
Diogo: next-generation model they described, whether this was actually on the longer way to GPT-4, which is another model that supposedly is more advanced, or there are way more parameters that trained that model.
00:04:15.070 –> 00:04:29.720
Diogo: Apparently. But apparently it’s not GPT-4, according to the interview. More: this new Bing also has an integration with Edge.
00:04:30.180 –> 00:04:44.660
Diogo: You can already try it by having access to the beta version of Edge, if your account isn’t already on the whitelist to
00:04:44.660 –> 00:05:04.130
Diogo: have access to the tool. So if you want, you can download the better version of edge, and you’ll have that tool that tool allows you to summarize information on the web page. Create content for for your Twitter, for example, for a trail reply. It’s quite.
00:05:04.130 –> 00:05:20.710
Diogo: It’s quite funny. I do have access to the tool, so I can show you a few things, if you want, later. But that goes a little bit away from Bing; this is just some integration that they did with ChatGPT also in Edge. Another point that this
00:05:20.750 –> 00:05:27.970
Diogo: Bing chat has is that it already displays ads, right? It
00:05:27.970 –> 00:05:56.370
Diogo: already has some integrated form of ads. And, according to Microsoft, I think last week they announced that there are new features coming; there are also new ad types that are going to come to the chat, and that’s also something very interesting. Finally, in the first 48 hours, Microsoft actually got over 1 million people on the waiting list to try
00:05:56.370 –> 00:06:04.010
Diogo: ChatGPT, or the new Bing actually, after they announced this thing.
00:06:04.330 –> 00:06:34.270
Diogo: Okay, let’s go to Google’s Bard, which is the name Google gave to its chat version, and it was presented a little bit before the Paris live event. But during the Paris live there was a little bit more information we could see. Unlike Bing, this Google Bard, at least in the current state it was presented in, doesn’t indicate any source of information. So it doesn’t say anything about where it
00:06:34.470 –> 00:06:38.840
Diogo: got the information from. Also,
00:06:38.840 –> 00:06:56.490
Diogo: we know now that Bing actually sometimes doesn’t get that information right as well; it just comes up with information or sources, right? So that’s also something very interesting. The second part is, according to Google, Bard is actually way more efficient
00:06:56.490 –> 00:07:15.850
Diogo: in terms of capacity and response. So, instead of… ChatGPT, apparently, I read somewhere, has a cost of $700,000 per day, or something like that. I don’t know how accurate that information is, but it does make sense that it uses a lot of CPUs,
00:07:15.850 –> 00:07:44.310
Diogo: a lot of CPU, to respond to users. So I don’t know, this might be a good advantage for Bard. Also, Bard, unlike Bing, doesn’t rely on a partner, right? So Bing, well, Microsoft, relies on OpenAI, which is another company, an independent company, even though Microsoft invested millions.
00:07:44.370 –> 00:07:54.970
Diogo: But it is supposedly an independent company or association, I don’t know, because I think they’re more of a nonprofit kind of thing.
00:07:55.000 –> 00:08:06.750
So this might be a disadvantage for Bing, and an advantage for Google, because it doesn’t rely on that partner. And lastly, I’m going to shut up, because I’m talking a lot:
00:08:07.020 –> 00:08:26.280
Diogo: after Google presented some misinformation. Actually, before this Paris live, there was a lot of scrutiny going on towards Google, because it presented wrong information. And this is something
00:08:26.280 –> 00:08:27.770
Diogo: funny, because
00:08:27.790 –> 00:08:52.250
Diogo: I’m not sure this applies to Bing, and we were just talking a little bit before we started today, and it is different. Like we discussed, Microsoft doesn’t have anything to lose, sort of, right? And we sort of forgive Microsoft. We sort of say: okay, we know it’s ChatGPT, we know it’s not always correct, we already have that context. But if it was Google
00:08:52.550 –> 00:09:07.490
Diogo: fucking up results, that would be a little bit more harsh. I think people would be more harsh on Google, because we do trust Google way more for information. And that’s a little bit of the summary I had for this
00:09:07.920 –> 00:09:26.270
Diogo: 2 weeks ago. I don’t know, I most probably missed some new information. What did I miss? Well, we know now that not all the sources in Bing are actually real sources, right? The system just comes up with sources sometimes.
00:09:26.270 –> 00:09:31.880
Diogo: We know already what’s happening before
00:09:32.420 –> 00:09:36.560
Diogo: the Bing chat, this new Bing chat.
00:09:36.560 –> 00:09:56.210
Diogo: We know actually what the system is doing: it first transforms the query of the user into a Bing query, then it goes through the Bing index, and only after that does it spit out the information, and it tries to use whatever came up in that Bing index. So that’s
00:09:56.210 –> 00:10:16.880
Diogo: really good for SEOs to know, because it’s important then to be on Bing, right? So that’s a good tip. And yeah, what else? Does anyone know anything? Everyone can talk, it’s an open mic. So if you have anything to add that I missed, please do
00:10:18.790 –> 00:10:26.420
Diogo: just raise your hand. I can’t be the guy that knows everything. Like, it’s not possible.
00:10:28.430 –> 00:10:29.850
Diogo: Come on, people, No one.
00:10:31.880 –> 00:10:33.200
Diogo: I just feel bad.
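As an aside on the pipeline Diogo just described (the user’s question is rewritten into a Bing query, run against the Bing index, and the answer is built from whatever came back): that pattern is commonly called retrieval-augmented generation, and a toy version can be sketched as below. The index, the data, and the crude keyword matching are all hypothetical stand-ins; the real system uses an LLM for the rewrite and answer-generation steps.

```python
# Toy sketch of: question -> rewritten search query -> index lookup ->
# answer grounded only in whatever the index returned.

# Hypothetical in-memory "index": page URL -> snippet text.
TOY_INDEX = {
    "example.com/eiffel": "The Eiffel Tower is 330 metres tall.",
    "example.com/bing": "The new Bing combines a GPT model with web search.",
}

def rewrite_query(question: str) -> str:
    """Stand-in for the LLM step that turns a chat question into a search query."""
    stopwords = {"what", "is", "the", "how", "tall", "please", "tell", "me", "about"}
    terms = [w.strip("?.,!").lower() for w in question.split()]
    return " ".join(w for w in terms if w and w not in stopwords)

def search_index(query: str) -> list[str]:
    """Return snippets that share at least one term with the rewritten query."""
    terms = set(query.split())
    return [
        snippet for snippet in TOY_INDEX.values()
        if terms & {w.strip(".").lower() for w in snippet.split()}
    ]

def answer(question: str) -> str:
    """Compose an answer grounded only in retrieved snippets."""
    snippets = search_index(rewrite_query(question))
    if not snippets:
        return "No grounding found."
    # A real system would generate text conditioned on the snippets and cite
    # them as sources; here we simply quote the retrieved text.
    return " ".join(snippets)
```

The SEO takeaway holds even in this toy version: a page that is not in the index being queried (here, Bing’s) can never appear in a grounded answer.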
00:10:33.270 –> 00:10:38.260
Diogo: Thank you. Maybe you can… Well, we can first
00:10:38.380 –> 00:10:46.520
Christian Oliveira: talk, because I don’t know if everybody is aware of what ChatGPT is, and what all these kinds of
00:10:46.610 –> 00:10:55.400
Christian Oliveira: tools are exactly, or how they work. So, from my understanding, which is not a lot either, because I just
00:10:55.520 –> 00:11:06.030
Christian Oliveira: read some information explaining it. But basically, what’s happening here is different from what we had in SEO
00:11:06.100 –> 00:11:16.930
Christian Oliveira: before, because we always had ways of generating automated content. What they are doing is, like, they are creating a model,
00:11:17.170 –> 00:11:32.580
Christian Oliveira: which is something that is trained with tons of parameters. And what that model does is just guess which word should follow.
00:11:32.590 –> 00:11:43.420
Christian Oliveira: Okay, so it’s not like we are now, like, chatting with this kind of model, and we might treat it as a person or a support agent or something. But it’s just like,
00:11:43.420 –> 00:11:57.030
Christian Oliveira: like, imagine a function that you give a text, and it will calculate, based on a lot of information it has, what is the next word it could put after the text that it received,
00:11:57.030 –> 00:12:16.090
Christian Oliveira: to make sense. Okay. So the goal, or at least from what I read, the goal of these models is to sound reasonable. So it’s like those kinds of people that we all know, that can explain anything to you, even if they don’t know, and sound very assured. But if you try to dig a bit,
00:12:16.090 –> 00:12:33.810
Christian Oliveira: they may not know exactly what they are talking about. So that’s what’s happening now. Yeah, it’s like they are overconfident, and they sound overconfident, and they talk like they know everything, but it’s not like that. They don’t have this intelligence, or
00:12:33.980 –> 00:12:43.440
however we want to call it, because we also don’t know what intelligence is, or how our brain works. So how exactly can we define that? But
00:12:43.850 –> 00:12:57.070
Christian Oliveira: so what they do is that? And that’s why. Sometimes the answer is, are not accurate because they are not exactly trained to be accurate. They are. They are trying to give it a
00:12:57.310 –> 00:13:14.650
Christian Oliveira: give an answer. That sounds reasonable. Okay. So they have like these millions of text that, like the other, was explaining, based on the websites books whatever. And of course, that information is not like.
00:13:14.650 –> 00:13:31.220
Christian Oliveira: selected in a way that you can say this website’s information is good, or this specific information is good. They just have all of that, and based on that, they give an answer. And what’s fascinating, at least for me, is that
00:13:32.590 –> 00:13:50.150
Christian Oliveira: we don’t know exactly what we might get from this. So it’s getting a lot of buzz and a lot of reactions, because it’s something that, technologically and
00:13:50.150 –> 00:14:04.500
Christian Oliveira: in terms of use, is, like, something that we didn’t have before, and that can have a lot of possibilities. Of course it has a lot of problems, but we are just at the beginning. So I think we can debate a bit about that, like:
00:14:04.510 –> 00:14:13.910
Christian Oliveira: what are the implications of this kind of technology, which right now is, like, something very… it’s like a toy. We are
00:14:13.910 –> 00:14:29.610
Christian Oliveira: testing it and trying to break it, and all of the people are, like, trying to make it do stupid things and all that. But what is underlying all of this is, like, potentially something big, because, I don’t know, imagine you can,
00:14:29.660 –> 00:14:32.510
Christian Oliveira: eventually, if it’s, like,
00:14:32.810 –> 00:14:45.920
Christian Oliveira: good enough, you can ask the bot to create a website and publish 1 billion articles per day, or something like that, with information it found, with some guidance. And that can change a lot how
00:14:45.920 –> 00:15:01.960
Christian Oliveira: information is generated, consumed, classified, by itself, and it can open, like, a new way of doing things that we don’t really have now, or at least not mainstream. So, I don’t know, what are
00:15:02.010 –> 00:15:14.310
Christian Oliveira: your thoughts? Like, anybody, please, if you are listening and you want to say something, or you have questions. But we are not really experts on AI, so we cannot
00:15:14.330 –> 00:15:20.590
Christian Oliveira: answer, probably, a lot of them. We are just talking and trying to debate. But if anybody wants to add something, or to…
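To make Christian’s “function that guesses the next word” framing concrete, here is a deliberately tiny sketch. It is not how GPT works internally (GPT is a large neural network over tokens, not a bigram table), but it shows the same idea: continue text using only statistics of text seen before.

```python
from collections import Counter, defaultdict

# A made-up miniature "training corpus"; real models train on vastly more text.
CORPUS = ("the tower is tall . the tower is old . "
          "the model is confident . the model is wrong .").split()

# Count how often each word follows each other word (a bigram table).
FOLLOWERS: dict[str, Counter] = defaultdict(Counter)
for prev, nxt in zip(CORPUS, CORPUS[1:]):
    FOLLOWERS[prev][nxt] += 1

def next_word(prompt: str) -> str:
    """Greedily return the most frequent follower of the prompt's last word."""
    last = prompt.split()[-1]
    candidates = FOLLOWERS.get(last)
    return candidates.most_common(1)[0][0] if candidates else "."

def continue_text(prompt: str, n: int = 3) -> str:
    """Extend the prompt one predicted word at a time."""
    words = prompt.split()
    for _ in range(n):
        words.append(next_word(" ".join(words)))
    return " ".join(words)
```

The failure mode matches the discussion: the continuation is always fluent relative to the corpus, but nothing in the procedure checks whether what it says is true.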
00:15:20.820 –> 00:15:30.860
Diogo: If I said something wrong, you can participate.
00:15:31.090 –> 00:15:39.170
Diogo: Is ChatGPT going to replace search, right? Like Google search, for example, right? Or is search going to change
00:15:39.550 –> 00:15:42.270
Diogo: from the state where it is now.
00:15:43.900 –> 00:15:47.160
Diogo: into more of a chat or AI
00:15:47.750 –> 00:15:52.880
Diogo: response? Pedro, do you want to say something?
00:15:54.530 –> 00:15:58.050
Pedro Dias: I don’t think it is, to be honest. I mean,
00:15:58.420 –> 00:16:06.760
Pedro Dias: we all know that, you know, Google has wanted to be your personal assistant for a long time. I think we can all agree with that.
00:16:06.860 –> 00:16:09.400
Pedro Dias: What we are dealing with.
00:16:09.490 –> 00:16:12.240
Pedro Dias: and, as said by Christian,
00:16:12.790 –> 00:16:13.740
is.
00:16:14.200 –> 00:16:15.730
Pedro Dias: we are not dealing with
00:16:15.970 –> 00:16:29.600
Pedro Dias: intelligence. We are not dealing with sentience. We are dealing with an autocomplete, basically. And yesterday that autocomplete was on my iPhone, instead of, you know…
00:16:30.070 –> 00:16:34.910
Pedro Dias: And now we are like dealing with the
00:16:35.250 –> 00:16:39.240
Pedro Dias: something that we have never experienced before,
00:16:39.470 –> 00:16:41.850
Pedro Dias: and some of us
00:16:42.240 –> 00:16:44.630
are mistakenly
00:16:44.690 –> 00:16:46.330
Pedro Dias: taking it for
00:16:46.730 –> 00:16:50.890
Pedro Dias: intelligence that can create something new.
00:16:51.550 –> 00:16:56.200
Pedro Dias: What happens with this, as an autocomplete,
00:16:56.230 –> 00:16:58.640
Pedro Dias: which is what it actually is.
00:16:58.830 –> 00:17:06.450
Pedro Dias: It can only create what already exists. So it’s gonna mash up stuff that already exists and present it to you
00:17:06.569 –> 00:17:12.339
Pedro Dias: in a way that, if you don’t understand
00:17:12.450 –> 00:17:16.010
Pedro Dias: in depth the
00:17:16.119 –> 00:17:23.060
Pedro Dias: issue that you are dealing with… For example, a few days ago I was
00:17:24.390 –> 00:17:37.090
Pedro Dias: using ChatGPT, as I subscribe to the Plus version, and I was using it to write me a script in R to pull data from Search Console
00:17:37.110 –> 00:17:38.730
Pedro Dias: and export it.
00:17:40.640 –> 00:17:46.440
Pedro Dias: and it used the wrong library when it wrote the script.
00:17:47.520 –> 00:17:48.810
Pedro Dias: Myself,
00:17:49.120 –> 00:17:54.450
Pedro Dias: as someone that deals with that on a daily basis, I spotted the error,
00:17:54.950 –> 00:18:01.490
Pedro Dias: and, you know… but I would say that someone that doesn’t have the minimum understanding of
00:18:01.520 –> 00:18:02.940
programming, or,
00:18:03.300 –> 00:18:06.880
Pedro Dias: you know, anything around the language R,
00:18:07.060 –> 00:18:08.830
would not be able to spot
00:18:09.300 –> 00:18:14.400
Pedro Dias: the mistake. So the thing is that, as you said, it presents
00:18:14.830 –> 00:18:19.280
Pedro Dias: an answer, or a solution,
00:18:19.420 –> 00:18:27.830
Pedro Dias: in a way that it’s so confident of itself that it fools you into believing that
00:18:27.960 –> 00:18:31.410
Pedro Dias: the answer that it’s giving you is 100% correct.
00:18:31.420 –> 00:18:44.180
Pedro Dias: And for us as humans, sometimes, when we don’t understand very much about the subject, we’ll say: I’ll run with this. But there are dangers in running with the information that is given to you
00:18:44.520 –> 00:18:47.550
Pedro Dias: by something like this, which
00:18:48.040 –> 00:18:52.400
Pedro Dias: just came out, like, in November last year,
00:18:52.680 –> 00:18:54.890
Pedro Dias: and everyone is already like
00:18:55.050 –> 00:18:55.950
Pedro Dias: thinking that
00:18:56.320 –> 00:18:58.920
Pedro Dias: this can replace some
00:18:59.000 –> 00:19:11.160
Pedro Dias: people’s jobs, or it can replace search. I don’t think it can replace search. I don’t know if you guys all like to use it to find something that you really want, like…
00:19:12.380 –> 00:19:13.150
Pedro Dias: But
00:19:14.060 –> 00:19:19.230
Pedro Dias: me, as a user, I want to be given a choice of
00:19:19.240 –> 00:19:20.970
a range of stuff
00:19:21.030 –> 00:19:32.380
Pedro Dias: that I can look at, and I can pick what I want, rather than be given something. Unless it’s a fact, unless it’s something that is,
00:19:32.490 –> 00:19:35.210
Pedro Dias: you know, undisputed,
00:19:35.220 –> 00:19:39.520
Pedro Dias: like, how tall is the Eiffel Tower. I don’t care, I would just want that fact.
00:19:39.680 –> 00:19:47.360
Pedro Dias: So, for this kind of information, that is factual, that is around,
00:19:48.090 –> 00:19:55.800
Pedro Dias: you know, universal truth, I’d say yes, maybe it can replace going to Google and doing a search,
00:19:55.970 –> 00:20:09.790
Pedro Dias: because we already do that on Siri and Alexa, or whatever, you know, we call them. So I think what’s gonna happen is that we are gonna see these personal assistants become way smarter with this,
00:20:10.080 –> 00:20:12.480
because they are going to be injected with this.
00:20:12.520 –> 00:20:15.620
Pedro Dias: I know, I think GPT-3 uses, like,
00:20:15.960 –> 00:20:18.500
Pedro Dias: 150 million parameters
00:20:18.620 –> 00:20:24.460
Pedro Dias: to be trained. While LaMDA, which is the
00:20:25.360 –> 00:20:31.470
Pedro Dias: one developed by Google, is like 440 million. So it’s, like, magnitudes
00:20:31.680 –> 00:20:42.560
Pedro Dias: bigger. And probably that’s why some engineer at Google was freaking out a few months ago, saying that the AI is alive,
00:20:42.660 –> 00:20:44.950
because it fools you so well
00:20:45.070 –> 00:20:47.550
Pedro Dias: that you believe that it’s alive.
00:20:48.660 –> 00:21:01.380
Pedro Dias: But yeah, it’s not. So these are the risks. I don’t think it’s gonna replace search, because of the reasons I already said. But I think we should all be aware that this is not
00:21:01.790 –> 00:21:04.770
Pedro Dias: intelligence, or something that’s gonna,
00:21:04.880 –> 00:21:05.680
you know.
00:21:08.870 –> 00:21:10.500
Pedro Dias: Do a
00:21:10.540 –> 00:21:15.070
Pedro Dias: you know, hand you something that is always 100% correct.
00:21:17.170 –> 00:21:20.610
Diogo: It’s funny, because I totally disagree,
00:21:21.780 –> 00:21:36.710
Diogo: right? Like, I mean, in terms of my experience. Well, not totally, but I do disagree on this, because it did replace search for me, right? I used to search for code all the time, right? I used to search, like…
00:21:36.710 –> 00:21:43.600
I don’t know, I would go to Stack Overflow. I would spend hours on Stack Overflow,
00:21:43.930 –> 00:21:53.630
Diogo: trying to find some answers about, like, the WordPress API, or my shitty JavaScript skills. And,
00:21:54.000 –> 00:21:56.230
Diogo: well, it did replace it.
00:21:56.380 –> 00:22:08.400
Diogo: It did replace Google for me in that sense, like, all the specific searches; it completely replaced Google for me. And,
00:22:08.700 –> 00:22:10.680
Diogo: Funny thing, I don’t think
00:22:10.720 –> 00:22:14.750
Diogo: ChatGPT is good at factual questions,
00:22:15.450 –> 00:22:41.330
Diogo: which is funny, right? Because you’d think, it’s a fact, it is, but it takes so long for the fact to come up. Like, for that, I just prefer Google, for example, right? The featured snippet is there, like, it’s quick, it’s fast, it’s there. But with the chat it takes so long to answer a factual question. I don’t know, like, I want to know how big the Eiffel Tower is, right?
00:22:41.450 –> 00:23:01.590
Diogo: And it takes so long to write an answer. And if you do use Bing, it goes to Bing, and it has to, like, translate it into a query, and then it spits it out, and it sucks, right? But for specific things, real specific things, for me, at least, that’s how I use it.
00:23:02.450 –> 00:23:09.630
Pedro Dias: The Bing version, right? You’re using… I’m using the ChatGPT one, on OpenAI.
00:23:09.740 –> 00:23:13.310
Pedro Dias: Yeah. In my experience…
00:23:13.390 –> 00:23:17.740
Pedro Dias: but, like, I agree with you in the example that I gave, with my
00:23:17.870 –> 00:23:19.530
script in R.
00:23:19.580 –> 00:23:31.540
Pedro Dias: I use it for this kind of question, this kind of issue. Like, I used it the other day to write me a Jira ticket, for example, right? A Jira ticket to implement, like, for example, lastmod in
00:23:31.670 –> 00:23:50.510
Pedro Dias: the XML sitemap, and the availability of that tag in some kind of… So I used it for this kind of thing, and it does it really well. It writes me the Jira ticket, but it’s based on everything that exists. I just go there and then check for errors or mistakes, and then it’s done.
00:23:50.510 –> 00:23:55.680
Pedro Dias: But yes, for that I agree, I can use it. Most people agree, like,
00:23:55.780 –> 00:24:05.130
Pedro Dias: we’re discussing on Twitter that, you know, Stack Overflow is dead with ChatGPT, because if you want a snippet of code, it can generate the snippet of code in
00:24:05.320 –> 00:24:08.800
Pedro Dias: a much more, you know,
00:24:09.490 –> 00:24:18.590
Pedro Dias: tailored way to your needs than Stack Overflow, because there you have to go and find out if it fits what you want, and then you have to modify the code, and then, you know…
00:24:21.430 –> 00:24:32.310
Montse Cano @MontseCano: I don’t know, but you were looking for code, right? Maybe for code it is easier. But I find that all the answers, or many of the answers, that I have
00:24:32.360 –> 00:24:39.600
Montse Cano @MontseCano: searched for on ChatGPT have not been factually right at all.
00:24:40.540 –> 00:24:56.250
Montse Cano @MontseCano: That has been my experience, and I’m learning how to use it, right? Because, as I said, it only came about last October, November, at least mainstream, and
00:24:56.250 –> 00:25:08.920
Montse Cano @MontseCano: we are still learning how to use it. But I don’t really think it’s going to replace anything. If anything, it’s going to be helping us to search,
00:25:09.320 –> 00:25:14.060
Montse Cano @MontseCano: transforming the way we are doing it. Well, definitely not replacing.
00:25:14.110 –> 00:25:25.140
Montse Cano @MontseCano: I think we tend to be very negative. It happened with the radio, it happened with TV, it happened
00:25:25.270 –> 00:25:29.880
Montse Cano @MontseCano: with email marketing as well email list that everything is dead
00:25:30.170 –> 00:25:33.090
Montse Cano @MontseCano: all the time. That is really boring.
00:25:33.140 –> 00:25:49.820
Montse Cano @MontseCano: It really is very boring. And now, when I’m seeing all these wrong answers, to be perfectly honest, I just can’t believe that people are relying on something
00:25:49.970 –> 00:25:52.650
Montse Cano @MontseCano: that is not getting them
00:25:52.860 –> 00:26:03.200
Montse Cano @MontseCano: anywhere. I mean, I tried in ChatGPT today: I was trying to analyse a few… quite a few keywords, to be perfectly honest,
00:26:03.280 –> 00:26:11.560
Montse Cano @MontseCano: that were coming from internal search. And what can I say? Maybe I don’t know how to use it properly, but
00:26:12.700 –> 00:26:19.610
Montse Cano @MontseCano: I didn’t get anything, anything that I could definitely share with clients, or that they could look at.
00:26:19.680 –> 00:26:20.440
Montse Cano @MontseCano: No.
00:26:22.690 –> 00:26:24.800
Pedro Dias: I think
00:26:25.700 –> 00:26:26.920
Ana Verissimo: Go ahead.
00:26:27.110 –> 00:26:40.780
Pedro Dias: No, I just wanted to add: the wrong answers are everywhere, right? Not just in code. They are, in fact, everywhere that you use ChatGPT.
00:26:40.810 –> 00:26:48.880
Pedro Dias: That’s what I was saying initially: it’s an autocomplete, and you cannot rely, you know, 100% on what it tells you. But
00:26:49.040 –> 00:26:53.860
Pedro Dias: looking into the future, and for the application of it,
00:26:54.010 –> 00:26:55.300
that’s my point.
00:26:55.390 –> 00:27:00.700
Pedro Dias: it’s going to be much better applied to factual, you know, what’s factual and what’s, like,
00:27:00.720 –> 00:27:06.130
Pedro Dias: informational, rather than anything else. I don’t see myself, like,
00:27:06.300 –> 00:27:25.600
Pedro Dias: using it to plan my holidays somewhere, or, you know, to order me something from Amazon, or something like that, because it might order me the wrong thing. I don’t, you know… I want to be given a choice. I want to be given a range of things that I can look at and decide on my own.
00:27:29.160 –> 00:27:38.530
Ana Verissimo: The same, yes. I was gonna say, Montse, you were saying that maybe you don’t know how to use it, but you’re not getting the right answers, and I have the same experience and the same feeling.
00:27:38.630 –> 00:27:42.990
Ana Verissimo: Maybe there’s a way of using ChatGPT better that I’m not using,
00:27:43.020 –> 00:27:51.810
Ana Verissimo: because, again, same thing for me: the answers I got were not correct, even though they looked beautiful from the outside, just like what was said
00:27:52.100 –> 00:28:05.740
Ana Verissimo: already. But when you actually go into the details… Like, I was trying to get structured data for recipes, for example. It looks great, but then I go: wait, there’s an ingredient that is not part of this recipe, but it’s here, all nicely put,
00:28:05.930 –> 00:28:07.260
Ana Verissimo: like, and
00:28:07.470 –> 00:28:16.300
Ana Verissimo: it doesn’t even add a note, like, this ingredient was not… You know, I was giving it the URL, like, write me the recipe structured data for this, and I was getting, like…
00:28:16.550 –> 00:28:26.410
Ana Verissimo: like, everything looked right, but it was not right when you go into the details. And I was thinking, actually, maybe the difference, because I know you’ve been having a very different experience,
00:28:26.760 –> 00:28:34.650
Ana Verissimo: is maybe because you are also very committed to make it… I mean, to understand this, to make it work, to write it. And I don’t mean it in a bad way, I actually mean it like:
00:28:34.840 –> 00:28:44.730
Ana Verissimo: maybe the difference is how committed you are to actually understand, you know, like, the best way to interact with it. But there are always going to be mistakes, and, one,
00:28:44.960 –> 00:28:51.850
Ana Verissimo: you need to maybe become more committed to actually make it work, and to be okay with the time you’re going to have to spend
00:28:51.930 –> 00:28:55.390
Ana Verissimo: checking for any errors, because you also then have to spot any.
00:28:55.430 –> 00:29:10.110
Ana Verissimo: No, there’s a human check that needs to be done, and maybe in the end that’s going to be the difference. And that’s why I think that people are a bit hysterical. I also don’t think it’s going to replace search, I’m part of this team; I don’t think it’s going to change that much in the short term,
00:29:10.350 –> 00:29:16.230
Ana Verissimo: maybe because, you know, either you are committed to actually… like, either you are engaged,
00:29:17.300 –> 00:29:28.860
Ana Verissimo: or it’s not going to serve you, because you go there, you try a few times: oh, this looks so funny, so nice; oh, write me a text of 500 words about traveling to wherever. It gives you something that looks super nice.
00:29:29.160 –> 00:29:47.680
Ana Verissimo: It doesn’t go beyond that unless you are committed to go beyond that, maybe with all the errors that we’ve been talking about. So, I think, yeah, if it requires a lot of effort, people are not gonna go into it. It’s like, let’s all go to this… what’s the name? I even forgot, and I was on the app myself.
00:29:47.720 –> 00:29:54.450
Ana Verissimo: That’s the place, Mastodon, thank you. It’s like, let’s go to Mastodon, and it was difficult to understand.
00:29:54.850 –> 00:29:59.230
Ana Verissimo: I mean, just the same, people probably stayed where they were, and
00:29:59.990 –> 00:30:01.610
Ana Verissimo: I don’t think it’s going to be...
00:30:01.750 –> 00:30:11.290
Ana Verissimo: I think there’s going to be an impact on the way we search, but I think it’s going to be a more nuanced thing than oh, everything is going to change now, new world, new Internet.
00:30:11.510 –> 00:30:14.230
Ana Verissimo: Yeah, I don’t think so as well. I think
00:30:14.570 –> 00:30:18.380
Ana Verissimo: it’s going to be a a slow progress there.
00:30:18.560 –> 00:30:20.950
Pedro Dias: You don’t go on. I don’t 500 new jobs.
00:30:24.950 –> 00:30:25.830
Diogo: Christian.
00:30:27.490 –> 00:30:32.540
Christian Oliveira: I also don’t think it will replace search,
00:30:33.520 –> 00:30:36.700
Christian Oliveira: because of what has already been said,
00:30:38.260 –> 00:30:50.370
Christian Oliveira: and especially for complex topics, especially when you are learning something or trying to understand some more complex topic, or
00:30:50.450 –> 00:30:56.040
Christian Oliveira: even, I don’t know, maybe trying to buy a phone or something like that.
00:30:56.250 –> 00:31:02.600
Christian Oliveira: Users normally don’t just go and click on the first search result
00:31:02.640 –> 00:31:18.500
Christian Oliveira: and blindly believe what’s said there. Of course, more people click on the first result as well. But normally, if you watch (I do this exercise a lot with friends), if you watch how people do
00:31:18.520 –> 00:31:28.140
Christian Oliveira: things, people normally open different sites, compare, trust some more than others because of how they are written, because of the name behind them.
00:31:28.140 –> 00:31:43.340
Christian Oliveira: There are a million things influencing that, and of course there will be some people that will believe everything, there are people that click on ads, and there are people that do a lot of stuff. But in general I don’t think that we can
00:31:43.530 –> 00:32:01.180
Christian Oliveira: expect that these tools will replace search. But I really think that this will be integrated. In fact, it’s already integrated, not as a chat, but, for example, when you search for news on Google, for hot topics, I think sometimes you already get these
00:32:01.270 –> 00:32:21.100
Christian Oliveira: generated headlines grouping different news pieces. So, for example, you search for a topic, and it automatically gives you several news pieces and a headline, which I think is made by AI, explaining what those things are about. So
00:32:21.100 –> 00:32:22.510
Christian Oliveira: these technologies that
00:32:22.730 –> 00:32:38.990
Christian Oliveira: we are seeing here will probably be used a lot in some cases, like the things that are already integrated into search, like video, Google Maps and all that. Probably we will start seeing that for some kinds of queries and for some kinds of
00:32:38.990 –> 00:32:52.870
Christian Oliveira: interactions. Maybe this will appear as an option, and things like that. But replace search, with people completely trusting something like this and changing their behavior completely? I don’t think so.
00:32:53.090 –> 00:33:03.080
Christian Oliveira: Or at least not in the short term. It will need to evolve really a lot, and to gain that trust that it doesn’t have now. But also, I don’t think we expect...
00:33:03.340 –> 00:33:13.700
Christian Oliveira: at least the people I know that are using this, we know it’s not working perfectly, so we don’t expect that
00:33:13.700 –> 00:33:24.420
Christian Oliveira: it always gives a nice answer. When you understand how it works, you also understand that it’s not perfect. But it’s revolutionary in the sense that,
00:33:24.510 –> 00:33:36.880
Christian Oliveira: for the first time, I think, a machine can talk to you in a reasonable way. Until now it wasn’t like that, it was automated messages, all pretty much fabricated
00:33:36.890 –> 00:33:49.020
Christian Oliveira: by a human. And this can create something that you don’t really read, word for word, in other places. So it will probably be integrated.
00:33:49.190 –> 00:34:07.040
Christian Oliveira: In that kind of sense, I think. But I don’t think that people will blindly start, like: okay, I want to buy an iPhone or a phone, give me the best one, and I will click and buy it. That will not happen. Maybe it gets integrated, and maybe the chat evolves to give more options to the user, I don’t know, but
00:34:07.040 –> 00:34:15.760
Christian Oliveira: it’s very difficult now to try to project anything like this, because it’s at the beginning
00:34:15.929 –> 00:34:33.860
Diogo: of the technology. Maybe we’re also looking at this in absolute terms, black and white. Like, for example, Frederico in the chat here did ask something
00:34:33.860 –> 00:34:47.980
Diogo: really interesting, which is: don’t you think that by the end of this year people may be spending less time searching on the Google search bar? And I think this is a good point, in terms of: okay, maybe it doesn’t replace it completely.
00:34:47.980 –> 00:35:09.000
Diogo: But maybe people will also use other tools more and stop going to Google as much as they are. So instead of replacing it completely, maybe just replacing it a little bit, at least. I mean, from my experience, from what I’ve been doing, for example:
00:35:09.000 –> 00:35:19.250
Diogo: it’s weird, but I do have an OpenAI chat window open on my computer every single day, and I do go to it
00:35:19.270 –> 00:35:28.400
Diogo: so many times. And now I’m using Bing as well a little bit, just to try it out. But that habit is there.
00:35:28.590 –> 00:35:32.640
Diogo: You know, now it’s open there, right? And
00:35:32.900 –> 00:35:38.580
Diogo: maybe it doesn’t replace it completely, but a little bit. Can that be, Christian?
00:35:40.210 –> 00:35:43.660
Christian Oliveira: I don’t know. I suppose that
00:35:44.400 –> 00:35:49.460
Christian Oliveira: more people will use this as it becomes available. But I don’t think
00:35:49.490 –> 00:36:01.960
Christian Oliveira: it will be... The question or the comment was about the end of this year. I don’t think that by the end of this year there will be something massive, massive in the sense of,
00:36:02.220 –> 00:36:05.030
Christian Oliveira: I don’t know, half
00:36:05.190 –> 00:36:11.610
Christian Oliveira: the people using Google now using this regularly, or something like that. Because,
00:36:11.640 –> 00:36:29.150
Christian Oliveira: well, it depends. Of course, if Google integrates it into search... but that’s not replacing, that’s integrating, in the sense that we will use it whenever it appears, or something like that. But voluntarily going there? I don’t think that will happen so fast.
00:36:29.150 –> 00:36:32.450
But it’s a guess, I don’t know, I don’t have any...
00:36:32.780 –> 00:36:43.350
Christian Oliveira: It will depend a lot on the news, on what they do. Because, for example, my parents don’t even know what this is, and they use Google, and a lot of people are like that, and
00:36:43.540 –> 00:36:53.450
Christian Oliveira: they will probably end up trying it if Google, for example, integrates it, or if everybody starts talking about it. But
00:36:53.690 –> 00:36:56.310
Christian Oliveira: that’s slow, I think.
00:36:57.710 –> 00:37:11.380
Montse Cano @MontseCano: But also, if anything, we have to just go back to marketing, to what marketing is per se. And we have seen a lot over history that nothing has changed
00:37:11.400 –> 00:37:14.670
Montse Cano @MontseCano: people’s behavior quickly.
00:37:15.000 –> 00:37:33.910
Montse Cano @MontseCano: And it took something like a pandemic; nothing else actually makes them change their behavior very quickly, because we tend to be creatures of habit, right? We feel comfortable where we are, and suddenly having to change,
00:37:33.950 –> 00:37:39.420
Montse Cano @MontseCano: that moves us outside of our own comfort zone. At least not quickly.
00:37:39.420 –> 00:37:58.890
Montse Cano @MontseCano: That is a very interesting point, right? Because in that sense you’re sort of also saying that Google does still have time to come up with its own version, if it needs to, right? Because the user, the human, does still need time to
00:37:58.890 –> 00:38:02.400
acquire a new behavior, right?
00:38:03.210 –> 00:38:12.740
Montse Cano @MontseCano: It’s interesting. Yeah, I think so. And they probably are working on it at the moment; I mean, they don’t announce absolutely everything that they are doing.
00:38:14.420 –> 00:38:15.010
Diogo: Yeah.
00:38:16.060 –> 00:38:25.440
Diogo: Okay, shall I read one of the questions that we have on Slido? So, one of the questions was
00:38:25.440 –> 00:38:49.120
Diogo: submitted a week ago, asking: how can businesses leverage ChatGPT and other AI-powered tools to optimize SEO efforts and navigate the opportunities and challenges they present? It does seem like ChatGPT wrote this. I don’t want to offend anyone,
00:38:49.120 –> 00:38:50.370
Christian Oliveira: but
00:38:50.430 –> 00:39:07.910
Christian Oliveira: it sounds full of buzzwords. It’s like that boss that doesn’t understand what’s happening but wants to sound smart, so he asks the question. He doesn’t even know what to expect as an answer, but it sounds good. It’s like, wow, that’s so smart, so...
00:39:07.950 –> 00:39:16.440
Diogo: Strategy and all of that. So, do you wanna take a stab at it? Some of the tools, or how we...
00:39:17.860 –> 00:39:20.320
Christian Oliveira: Okay. So
00:39:20.900 –> 00:39:25.830
Christian Oliveira: I think... well, if you
00:39:25.860 –> 00:39:42.110
Christian Oliveira: look at the community, a lot of people are doing a lot of things with this already in terms of SEO. Normally, from what I see, it’s more like an assistant, so they treat ChatGPT as an assistant for small
00:39:42.130 –> 00:40:00.220
Christian Oliveira: tasks, or tasks that are heavy, so ChatGPT makes them faster. I have seen, for example, a lot of sharing of prompts, like how to ask
00:40:00.370 –> 00:40:06.860
Christian Oliveira: ChatGPT to get a list of keywords, for example, with certain
00:40:06.900 –> 00:40:14.880
Christian Oliveira: parameters. But I think it’s more... well, probably others can share some examples. But
00:40:15.110 –> 00:40:19.910
Christian Oliveira: for me the problem is that I still cannot
00:40:19.920 –> 00:40:20.920
Christian Oliveira: trust
00:40:20.950 –> 00:40:30.310
Christian Oliveira: the response. So after getting it, I need to go and check that the things make sense and all of that. So it’s like
00:40:30.440 –> 00:40:45.760
Christian Oliveira: hiring someone, which is free (well, unless you pay), to do some small tasks. That’s the level now. But of course you can do a lot of stuff, you can start creating content with this. We don’t really know how it will
00:40:45.800 –> 00:40:50.250
Christian Oliveira: behave. But you can.
00:40:50.260 –> 00:40:56.090
Christian Oliveira: The limit is your imagination, in the sense that whatever you ask, it will respond. I don’t know if
00:40:56.170 –> 00:41:11.900
Christian Oliveira: for every case it will respond with something good. But you can think of any task that your company does, and you can try and see if ChatGPT can make it quicker, better, whatever the parameter is.
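The prompt-sharing Christian mentions usually boils down to a parameterized template. As a small illustrative sketch (the template wording, function name and parameters here are invented for the example, not prompts shown at the meetup):

```python
# Hypothetical helper that fills parameters into a reusable
# keyword-research prompt template for ChatGPT.

def keyword_prompt(topic, language="English", count=20, intent="informational"):
    """Build a prompt asking for a parameterized keyword list."""
    return (
        f"Act as an SEO assistant. Give me {count} {intent} keywords "
        f"in {language} for a website about {topic}. "
        "Return one keyword per line, with no numbering."
    )

print(keyword_prompt("vegan dog food", count=10))
```

Sharing "a prompt" in the community is essentially sharing this template plus which parameters are worth tweaking.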
00:41:13.350 –> 00:41:25.010
Diogo: Yeah, I’ve been doing a lot, and one of the latest tools I’ve created was...
00:41:25.620 –> 00:41:37.830
Diogo: Well, let me start at the beginning: in January last year I had already used the OpenAI Playground to start writing some content,
00:41:37.850 –> 00:41:45.630
Diogo: so ChatGPT didn’t exist back then, but OpenAI’s GPT-3 was already accessible,
00:41:45.630 –> 00:42:15.480
Diogo: and I did write some content. It did rank for a little bit, it wasn’t anything special. And I used it more, as you said, Christian, as an assistant: okay, tell me what’s important in this type of query or in this type of content, right? And it did help me a lot. I do remember one of the queries was:
00:42:15.480 –> 00:42:18.310
could dogs be vegan?
00:42:18.640 –> 00:42:36.030
Diogo: So it gave me some vegan recipes for dog food, which weren’t present anywhere, right? So it sort of created them, it made them up with its auto-complete.
00:42:36.030 –> 00:43:04.090
Diogo: I don’t know what it was, it was rice and beans or something. With Google, right, people say: well, everything is on Google, just Google it, there’s a website for everything. That’s a good point, but the idea here was to create fresh, unique content, right? And Google, in their presentation, did mention this, that
00:43:04.110 –> 00:43:30.780
Diogo: even their AI model, the chat model they had, Bard, was good at NORA questions, which means No One Right Answer. There’s no one right answer. This term is very interesting, and I think it speaks to that point of where it fails a little bit, with the
00:43:31.070 –> 00:43:50.080
Diogo: “how big or how tall is the Eiffel Tower” sort of questions. It does do better on these NORA questions, where it can just go on and on and have some freedom in the answer, because there’s no one right answer, right? So
00:43:50.110 –> 00:44:15.240
Diogo: yeah, I did want to create some fresh new content, and I wanted to see, first of all, if the articles would rank. They were shitty, they ranked for a little bit, and then they just went away. So that happened. So I think for writing content, as an assistant, it does help you, at least to
00:44:15.240 –> 00:44:21.990
outline a topic you don’t know much about, right? And I think this trust thing is very funny, because,
00:44:22.230 –> 00:44:41.690
Diogo: even though you guys were saying, I don’t remember who mentioned it, that you don’t trust the machine that much on the information, funny enough, when I was writing my first WordPress plugin through ChatGPT, and I know zero about PHP, right, which is the language, I did trust it.
00:44:41.770 –> 00:44:59.090
Diogo: I did trust it, and, trust me, I had to check: okay, does this work? Let me see. I do have some technical knowledge, especially from JavaScript, and I do connect some dots, of course you do need that knowledge, but I did trust it,
00:44:59.090 –> 00:45:06.360
Diogo: and I did publish my first plugin ever. I had never published a plugin, I could never have done it before, right? So
00:45:06.380 –> 00:45:12.940
Diogo: I did trust the machine in that sense. I don’t know if that makes sense.
00:45:13.090 –> 00:45:36.800
Diogo: But regarding tools, and I’m sorry, I do speak a lot, so please do shut me up: one of the other tools ChatGPT allowed me to create was one that grabs all the images from a WordPress website,
00:45:36.800 –> 00:45:55.930
Diogo: all the images that didn’t have the alt text attribute. And then I had ChatGPT describe every single image, and I learned that ChatGPT can’t describe an image from the file itself. So then I went: okay, then describe the image to me based on the URL
00:45:55.940 –> 00:46:12.960
Diogo: of the page where the image is. I gave it the URL, and because it can read HTML, it gave me a more concise alt text for that image. Of course the alt text is a phrase, right, it’s quite big,
00:46:12.960 –> 00:46:42.770
Diogo: it’s not optimized in a sense, but compared to not having any alt text, it does have information there. And many times it does refer to the website, the context where it is, the product, and the color of the product if that page relates to that color. And then ChatGPT also allowed me to say: okay, now create this PHP admin query
00:46:42.770 –> 00:46:55.710
Diogo: to publish all these alt texts in one single click, and it did. All alt texts were published on every single image that didn’t have that value.
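The first step of the workflow Diogo describes, finding every image on a page with a missing or empty alt attribute, can be sketched with Python's standard library alone. The page fetching and the ChatGPT calls that generate the descriptions are left out, and the class name is just illustrative:

```python
# Sketch: scan an HTML fragment and collect <img> tags whose alt
# attribute is missing or empty (the ones that still need alt text).
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collect the src of every <img> with no (or empty) alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            d = dict(attrs)
            if not d.get("alt"):          # absent or empty alt
                self.missing.append(d.get("src"))

html = '<img src="a.jpg" alt="a red phone"><img src="b.jpg"><img src="c.jpg" alt="">'
finder = MissingAltFinder()
finder.feed(html)
print(finder.missing)  # ['b.jpg', 'c.jpg']
```

From here, each collected URL would be handed to the model to draft a description, and the results written back via the site's admin interface, as in the workflow above.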
00:46:56.150 –> 00:47:13.060
Diogo: This is to say that it can help you specifically with these tasks, right? And there’s also a tool, GPT for Excel, which is amazing, because
00:47:13.060 –> 00:47:16.440
Diogo: you’re able to use the context of your Excel sheet,
00:47:17.490 –> 00:47:29.170
Diogo: right? You can use the context of your cells, and then have the system complete based on whatever context it had before. So if for every single cell you’re saying: remove
00:47:29.310 –> 00:47:34.930
Diogo: something, like the word iPhone or an underscore or whatever, it then completes everything,
00:47:34.930 –> 00:48:04.840
Diogo: and it does that. So you don’t need to find the function to remove that, right? It’s just that simple. I don’t know if you guys go through this, but so many times I have to work with data and I have to transform the information, right? And I would have to go and see which function to use: an IF, then the RIGHT function, then the LEFT function, the number of characters. I had to construct this whole formula. And now I just do it with
00:48:04.840 –> 00:48:10.200
Diogo: ChatGPT, or I mean with GPT-3. And
00:48:10.440 –> 00:48:30.820
Diogo: it just does that based on the context of what I’ve done previously for those cells. That’s freaking amazing. So that’s also really good when you’re working with data; if you need to transform your descriptions or your titles, for example, you can do that
00:48:30.870 –> 00:48:57.220
Diogo: way easier nowadays. So that’s what I’ve been working on as for tools. And of course WordPress plugins work really well to create with ChatGPT, I’ve been doing a lot of that. Yeah, and I’ll shut up for a little bit. What about everyone else? Regarding the plugins, how do you know that it doesn’t have some kind of
00:48:57.230 –> 00:49:01.730
Pedro Dias: huge vulnerability, because it uses some kind of weird library?
00:49:03.110 –> 00:49:06.850
Diogo: That’s a good point. I don’t.
00:49:07.450 –> 00:49:12.490
Pedro Dias: Oh yeah, that’s one of the things that you have to be on the lookout for as well.
00:49:12.530 –> 00:49:25.890
Pedro Dias: On the Excel one, yeah, I see a lot of value in that one, especially because sometimes I don’t remember half of the formulas that I need, and then I can just ask it to write them for me and do, you know,
00:49:26.160 –> 00:49:33.950
Pedro Dias: strange queries, almost turning Excel into a database, so that’s cool.
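What GPT for Excel automates here is essentially rule inference: instead of building nested LEFT/RIGHT/IF formulas, you describe (or demonstrate) the transformation and the model applies it to each cell. As a plain-Python stand-in, assuming made-up cell values, the "remove the word iPhone and the underscores" cleanup Diogo mentions looks like:

```python
# Stand-in for the inferred cell transformation: drop a product word
# and replace underscores with spaces, tidying the whitespace.

def clean_cell(value, word="iphone"):
    """Remove a given word (case-insensitively) and underscores from a cell."""
    cleaned = value.replace("_", " ")
    parts = [p for p in cleaned.split() if p.lower() != word]
    return " ".join(parts)

cells = ["iPhone_13_Pro_black", "case_for_iphone", "charger_usb_c"]
print([clean_cell(c) for c in cells])
# ['13 Pro black', 'case for', 'charger usb c']
```

The point of the Excel add-on is that you never write this logic yourself; the model infers it from your example cells and fills the column.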
00:49:34.030 –> 00:49:39.760
Pedro Dias: Yeah. But like I was saying, for it to write code...
00:49:39.990 –> 00:49:48.850
Pedro Dias: I’ve seen it fixing code, I’ve seen people asking: oh, find me the vulnerability in this code, and it finds it and fixes it.
00:49:48.890 –> 00:50:01.320
Pedro Dias: Now, when you ask it to write code, I don’t know if it does that check as well, so you probably should go and ask it to find the vulnerabilities in your plugin.
00:50:01.320 –> 00:50:18.920
Diogo: Yeah, there are a lot of prompts, you have to go around and re-prompt the information, right? So it’s not as easy as just saying: write me a plugin that does this.
00:50:19.360 –> 00:50:27.360
Diogo: I mean, it can do that, but at least I don’t trust it to be 100% working from scratch. You have to sort of
00:50:28.430 –> 00:50:41.560
Diogo: prompt-sculpt whatever you want at the end, and then it’s trial and error, right, and test it out. But it’s something that I wouldn’t ever have been able to do before.
00:50:42.050 –> 00:50:43.160
Diogo: And now I am.
00:50:44.810 –> 00:50:45.520
Diogo: You know
00:50:46.130 –> 00:50:50.320
Diogo: Anyone else using this for any other tools?
00:50:52.850 –> 00:50:53.640
Diogo: No.
00:50:55.160 –> 00:51:05.050
Fernando Morgado: No, I just find value in the automation part, you know, because if you are knowledgeable you can spot the errors
00:51:05.300 –> 00:51:09.170
Fernando Morgado: in your process. But if you are dealing with
00:51:09.320 –> 00:51:15.180
Fernando Morgado: a topic that you don’t know as well, it can be awful.
00:51:15.310 –> 00:51:17.410
Fernando Morgado: But I can see the value for development.
00:51:19.010 –> 00:51:21.380
Diogo: Okay, should we go to the next question?
00:51:21.640 –> 00:51:37.810
Christian Oliveira: Yeah, let me, because Frederico was saying here in the chat, and by the way, if you want to also speak, you’re more than welcome to join live here. So he was commenting
00:51:38.090 –> 00:51:43.820
Christian Oliveira: about buying products and trusting the machine. So,
00:51:43.970 –> 00:51:49.210
Christian Oliveira: from what I’m reading here, you were undecided about buying a
00:51:49.240 –> 00:52:07.320
Christian Oliveira: camera, and it helped you. So I think the trust thing is like... For example, the people here, probably we are more advanced, or already working with this. We are a minority, in the sense that we want it
00:52:07.320 –> 00:52:19.970
Christian Oliveira: to work, we are committed. This is a new technology, it’s like a toy, and we are playing with it, and of course we want it to work. But
00:52:19.980 –> 00:52:29.710
Christian Oliveira: when I was speaking about trust, I was referring to the fact that right now there is no way you are guaranteed to get
00:52:30.160 –> 00:52:47.120
Christian Oliveira: factual information. So you will need to do extra checks. For example, if I ask it about a product, I cannot trust the information about the product, because it can be true or it can not be. There is no guarantee, like the one
00:52:47.120 –> 00:52:54.000
Christian Oliveira: you will get, for example, from a specialized, I don’t know, photography blog that you have been following. So...
00:52:54.000 –> 00:53:10.430
Christian Oliveira: I know a lot of people don’t do that, right? I know, I know. But, for example, when you are going to spend a lot of money... There are some people that will just go to a website and buy, and that will happen probably also with ChatGPT, or any other thing like this. But
00:53:10.620 –> 00:53:18.370
Christian Oliveira: what I’m saying is that it’s more complex. For example, this is related to another question we have here, which I am going to read.
00:53:18.420 –> 00:53:36.800
Frederico Carvalho: Sorry, I had to put my baby to sleep. I will get back to you, but I have listened to what you have told me. She was sleeping while I was writing, and then she woke up,
00:53:36.860 –> 00:53:41.960
Frederico Carvalho: and when I come back I will write to you. But I understood what you have said.
00:53:42.380 –> 00:53:49.510
Christian Oliveira: Okay, but I will keep answering. I just wanted to read another question here, which is from Miguel.
00:53:49.710 –> 00:54:04.700
Christian Oliveira: So he asked: would you consider blocking ChatGPT from reading your website content, as it might be stealing your visitors without giving proper credit? And I think this is a really hot topic, because, for example, imagine that
00:54:04.930 –> 00:54:22.960
Christian Oliveira: ChatGPT indeed replaces, or gets more traction than, Google or any other. The information they give you is based on what they have in their
00:54:22.960 –> 00:54:38.380
Christian Oliveira: large model, okay? So imagine there is a new phone. They will only have the information if they get new information about it. For example, imagine there’s a new iPhone, iPhone 20 for example, and imagine that everybody blocks
00:54:38.380 –> 00:54:45.150
Christian Oliveira: their content from these kinds of robots, and they only get into their model the information
00:54:45.200 –> 00:55:02.270
Christian Oliveira: about the characteristics of the phone from Apple. But you don’t really care about the characteristics, you care about the experience of people with the phone: if it’s fast or not, how it is regarding charging. And all of that information needs real people trying the product, needs a lot more
00:55:02.270 –> 00:55:09.600
Christian Oliveira: than just some text that makes sense. And that’s why I think we are a long way from this, because,
00:55:09.600 –> 00:55:29.200
Christian Oliveira: of course, if they get the information, they may get to a point where you can go and say: okay, I want to buy a new phone, I have 100 dollars, which one is the best? I am the kind of user that just takes photos, whatever, you can explain, and it will make the recommendation. But they need this information. So
00:55:29.200 –> 00:55:30.660
Christian Oliveira: if people
00:55:30.910 –> 00:55:43.710
Christian Oliveira: who nowadays create content, not because they love to create content and share it for free, which was how it was in the beginning of the Internet, right... Now, whatever website creates content like this, really in depth,
00:55:43.710 –> 00:55:59.320
Christian Oliveira: is expecting to get money from it: through ads, through affiliates, whatever the monetization model they have. So if suddenly this starts to be a thing, and
00:55:59.410 –> 00:56:16.330
Christian Oliveira: that information is no longer rewarded indirectly with this kind of money, people will stop creating it, or will completely block it, if that’s a thing in the future, from this kind of bot. In my case, for example,
00:56:16.330 –> 00:56:25.180
Christian Oliveira: if I have really good content that takes a lot of time, and I see that, for example, these bots start,
00:56:25.380 –> 00:56:38.040
Christian Oliveira: I don’t know, using it, and I start losing money because of this, of course I will block it, if it’s an option and if it’s something that can allow me to get my money back, or something like that. But
00:56:38.460 –> 00:56:41.390
Christian Oliveira: I don’t know, it’s very complex, because
00:56:41.750 –> 00:56:44.460
Christian Oliveira: that’s it, especially for
00:56:44.600 –> 00:56:56.220
Christian Oliveira: e-commerce, buying, all of that. And in terms of just content, which was something I was discussing before the start of the event: any site that depends on
00:56:56.370 –> 00:57:12.030
Christian Oliveira: AdSense and things like that will have trouble with this. Because, for example, newspapers (I used to work at a newspaper) just want the page views. They don’t care if the information they publish is good enough or not, only
00:57:12.420 –> 00:57:30.780
Christian Oliveira: part of it. But if they do get some way of, say, doubling the traffic without losing on ads, they will do it, because they earn money from ads, and they don’t care if the person goes on the site and reads the entire news piece or only part of it. They don’t care about that, because the
00:57:30.870 –> 00:57:42.190
Christian Oliveira: business model is to get page views. So if this kind of thing goes mainstream... because a lot of people don’t care about the news, they just go and read the headlines and things like that. So
00:57:42.250 –> 00:57:48.040
Christian Oliveira: if those people, for example, change that behavior and just chat with ChatGPT
00:57:48.430 –> 00:58:04.960
Christian Oliveira: or any other, of course those sites lose traffic and lose money, and that will be a problem for them. And I think those are the ones that will suffer the most, because they are already suffering now; news sites, for example, are already suffering nowadays,
00:58:04.960 –> 00:58:13.130
Christian Oliveira: and this can be complex, and of course they will block any kind of AI, because they...
00:58:13.310 –> 00:58:16.260
Christian Oliveira: Do you disagree? Why?
00:58:16.790 –> 00:58:22.440
Pedro Dias: Because no one blocked Google over the featured snippets.
00:58:22.450 –> 00:58:36.280
Pedro Dias: Because they depend on it for getting the traffic. If the bot doesn’t send you the traffic, you don’t have any reason to keep it there. Nobody blocks Google, because you depend on Google for the traffic. But if ChatGPT doesn’t send you traffic,
00:58:36.350 –> 00:58:41.230
Christian Oliveira: why would you want to keep it there? I don’t know.
00:58:41.420 –> 00:58:46.170
Pedro Dias: I don’t know, there is a huge discussion around this, because everybody
00:58:46.210 –> 00:58:58.630
Pedro Dias: started to wonder: I mean, who gave permission to use the available content to do the learning? Because they need the corpus for the learning, right?
00:58:58.810 –> 00:59:10.890
Pedro Dias: I don’t remember them using my website, or me giving permission for them to use it. And moreover, currently there is no way to block anything.
00:59:10.900 –> 00:59:13.610
Pedro Dias: They can just take it.
00:59:13.650 –> 00:59:20.290
Pedro Dias: It’s online, they can just go there and not respect any kind of robots.txt or whatever you have there,
00:59:20.320 –> 00:59:27.470
Pedro Dias: because, you know, unless you literally block their IP address, or their user agent,
00:59:27.840 –> 00:59:36.500
Pedro Dias: on your own server, there’s no way to tell them that you are blocking them. I mean, they just took everything.
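The server-side blocking Pedro describes amounts to refusing requests by user agent or IP before serving content. A minimal sketch, with illustrative values: the IP is from the documentation range, CCBot is Common Crawl's crawler (whose corpus was used in GPT-3's training), and at the time of this meetup OpenAI had not yet published a crawler user agent of its own:

```python
# Sketch: refuse requests whose user agent or IP matches a blocklist.
# The agent substrings and IP below are examples, not an authoritative list.

BLOCKED_AGENTS = ("gptbot", "ccbot")      # substrings to refuse, lowercase
BLOCKED_IPS = {"203.0.113.7"}             # example IP (documentation range)

def allow_request(user_agent, ip):
    """Return False if the request looks like a crawler we want to refuse."""
    ua = (user_agent or "").lower()
    if ip in BLOCKED_IPS:
        return False
    return not any(bot in ua for bot in BLOCKED_AGENTS)

print(allow_request("Mozilla/5.0 (compatible; GPTBot/1.0)", "198.51.100.1"))  # False
print(allow_request("Mozilla/5.0 Firefox/110.0", "198.51.100.1"))             # True
```

As Pedro notes, this only works if the crawler announces itself honestly; a crawler that ignores robots.txt can just as easily send a browser-like user agent, which is exactly why the discussion moved toward regulation.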
00:59:36.560 –> 00:59:41.280
Pedro Dias: And there is a huge discussion now, people even calling on,
00:59:41.360 –> 00:59:51.220
Pedro Dias: you know, the EU and the GDPR privacy regulators to intervene, because
00:59:52.220 –> 00:59:56.390
Pedro Dias: they are seeing that this will turn into something even worse than Google.
00:59:56.810 –> 01:00:06.070
Pedro Dias: Because it's like taking Google and taking Facebook and everything that happened, and meshing the two together. And now you have a model that
01:00:06.380 –> 01:00:17.890
Pedro Dias: knows everything. And imagine when this model starts to learn about people. Because if you go now and ask a question about someone, if it's not someone that is,
01:00:17.900 –> 01:00:30.840
Pedro Dias: you know, in the search results, like in the knowledge panel or anything, it will say that it doesn't know much about the person. But once it starts to learn in an unsupervised way about everyone,
01:00:30.970 –> 01:00:38.760
Pedro Dias: starts to learn about Diogo, starts to learn about Fernando, and whoever, then at some point
01:00:38.870 –> 01:00:45.380
Pedro Dias: you will ask it something, and it will know something about Diogo, something about anyone here.
01:00:45.730 –> 01:00:50.020
Pedro Dias: And it's gonna become... if you let it just run wild
01:00:50.400 –> 01:00:54.850
Pedro Dias: without some kind of, you know, way of blocking these kinds of
01:00:55.230 –> 01:01:02.910
Pedro Dias: things, it's just gonna go wild. It's just gonna turn into a huge headache,
01:01:03.430 –> 01:01:06.200
and we are gonna miss the days of Facebook
01:01:09.930 –> 01:01:15.520
Christian Oliveira: Totally, totally. I completely agree. I think,
01:01:16.600 –> 01:01:19.040
Christian Oliveira: also, we are like at the beginning. So
01:01:19.050 –> 01:01:29.560
Christian Oliveira: they just released the beast, and now we are all seeing the potential impact that this may have. And of course,
01:01:29.930 –> 01:01:38.480
Christian Oliveira: the same as Google, because Google also, when it started, they didn't ask for permission to crawl the web,
01:01:38.630 –> 01:01:44.400
Christian Oliveira: I think. Or I don't know if it was different in the beginning, but I suppose they just started to crawl. And
01:01:44.760 –> 01:01:47.590
Christian Oliveira: I think the thing with the AI is that,
01:01:50.130 –> 01:01:53.210
Christian Oliveira: if we go more metaphorical, it's like
01:01:53.240 –> 01:01:59.430
Christian Oliveira: a superhuman that gets all the information of the world. So
01:01:59.700 –> 01:02:18.120
Christian Oliveira: I think people who work with AI will say: okay, you cannot block me from going and reading the news and then sharing the information I learned with another person. So it will be very difficult to control that, because even if we block it,
01:02:18.130 –> 01:02:31.610
Christian Oliveira: they can find ways around it, or another site can copy you and feed that information to the AI, I don't know. It will be very, very difficult to have clear limits in that sense.
01:02:32.090 –> 01:02:42.190
Christian Oliveira: The only option will be legal, but that's the slow way that never works, because by the time you get there it's already a problem. And like
01:02:42.370 –> 01:02:45.120
Christian Oliveira: with Google, you also have these problems of,
01:02:45.190 –> 01:02:52.900
Christian Oliveira: I don't know, people that want to use it to get something taken offline, but that takes a lot of time. It's not something immediate. So
01:02:52.930 –> 01:02:54.270
Christian Oliveira: I think this will
01:02:54.360 –> 01:03:03.170
Christian Oliveira: be, in the end, like all technologies: a powerful tool for good things, and it will be amazing for bad things. It will be hard, and
01:03:03.170 –> 01:03:21.570
Christian Oliveira: we will have to adapt, I suppose. If people start using it for that, we will have to make new regulations, and that will be slow. But of course a lot of bad things will happen with this, I'm completely sure about that. And in SEO you have a long record of people that do everything they can to game the system, so
01:03:21.570 –> 01:03:27.280
Christian Oliveira: again, this will be like that, but worse. So
01:03:27.350 –> 01:03:33.740
Christian Oliveira: we'd better prepare. / Yeah, one of the first images that came to my mind was
01:03:33.990 –> 01:03:51.510
Fernando Morgado: one tool feeding on content created by another tool, creating, like, an infinite loop of content creation. So yeah, it will be a mess for sure. So
01:03:52.030 –> 01:03:55.840
Diogo: well, what if Google starts sharing
01:03:56.150 –> 01:04:02.050
Diogo: ads revenue with the content creators?
01:04:02.550 –> 01:04:09.100
Diogo: Isn't that what the social media networks are doing more and more,
01:04:09.110 –> 01:04:28.690
Diogo: right? Because that's the way they manage their content. That's the way they incentivize people to create content. What if, for example, Google or Bing starts, whenever your website or information from your website is mentioned (which I'm not sure can be tracked
01:04:28.900 –> 01:04:36.310
Diogo: inside these models, but if it is; like, for example, we have the sources in Bing),
01:04:36.450 –> 01:04:55.760
Diogo: what if they start sharing too? And let me add this, because this was one of the topics from Microsoft's CEO: exactly the question of how they're going to keep the system working, and all this environment.
01:04:55.800 –> 01:04:57.450
Diogo: So it’s on their mind.
01:04:58.280 –> 01:05:03.420
Pedro Dias: Another thing that I just remembered is: how do you remove information from it?
01:05:03.720 –> 01:05:08.740
Pedro Dias: At some point it learns... once it learns about something,
01:05:08.980 –> 01:05:13.890
Pedro Dias: it's not just that: it's all the ramifications around the subject, right?
01:05:14.160 –> 01:05:20.020
Pedro Dias: So once you want to remove something, does it only remove
01:05:20.160 –> 01:05:20.920
Pedro Dias: that
01:05:21.190 –> 01:05:27.430
Pedro Dias: you know, the central piece of information? What about the surrounding information? What about,
01:05:27.480 –> 01:05:28.980
if Diogo wants,
01:05:29.230 –> 01:05:40.430
Pedro Dias: I don't know, say: I don't want people to know that I'm bald. And, you know, you tell the AI to remove that, and then I go and ask:
01:05:40.430 –> 01:05:59.190
Pedro Dias: oh, tell me, is Diogo bald? And the AI says: I cannot tell you that. Why? Did Diogo ask you to remove it? And the AI goes: yes. Well, why? Because he's...
01:05:59.210 –> 01:06:03.240
Ana Verissimo: If you put up 10 websites saying that, yeah, saying that you're bald...
01:06:03.720 –> 01:06:06.980
Pedro Dias: But the thing is, with Google you have
01:06:07.020 –> 01:06:08.700
Pedro Dias: physical assets
01:06:09.070 –> 01:06:15.300
Pedro Dias: that are somehow physical assets, that you can point to.
01:06:15.360 –> 01:06:23.670
Pedro Dias: With an AI you don't have that. You have to figure out where the information is, you know, propagated,
01:06:23.840 –> 01:06:25.770
Pedro Dias: in order to retract it.
01:06:26.570 –> 01:06:30.070
Pedro Dias: And that's really hard to do with this kind of
01:06:30.090 –> 01:06:32.660
unsupervised learning. Because if you
01:06:32.740 –> 01:06:45.960
Pedro Dias: think about machine learning, the purpose is that the machine learns on its own. So you would have to write some programs so the machine tells you what it learned and how,
01:06:47.960 –> 01:06:58.540
Pedro Dias: and then you'd have to go, kind of Men in Black style, at the machine and say: no, forget anything that you saw or learned about that reference!
01:07:00.410 –> 01:07:03.370
Ana Verissimo: That’s a very good point.
01:07:03.410 –> 01:07:13.590
Diogo: But, Ana, do you think this is a model that would maybe work: like, you create content and I'll give you money for the content?
01:07:13.640 –> 01:07:15.930
Diogo: because it does work on social media
01:07:16.190 –> 01:07:35.120
Diogo: right? And it is a little bit how they are working with the newspapers, if I'm not mistaken. They do pay; I don't know if it's a monthly fee or what it is. It's not even based on the amount of content, or on whether the content is clicked; it's sort of an estimate of something.
01:07:35.910 –> 01:07:41.670
Pedro Dias: We are already giving them money, right? At least we are already giving them money.
01:07:41.950 –> 01:08:00.020
Pedro Dias: I'm already paying for ChatGPT Plus, and, you know, I don't think you are using Bing for free either. At the least it's showing ad impressions in there, so you're paying for it somehow.
01:08:00.080 –> 01:08:00.870
So
01:08:02.480 –> 01:08:03.250
Christian Oliveira: yeah.
01:08:03.860 –> 01:08:11.400
Christian Oliveira: yeah, I think... well, they will probably integrate...
01:08:11.510 –> 01:08:23.050
Christian Oliveira: as for the publisher thing you mentioned, that they are paying: that's like 0.000-whatever percent of sites. It's a very specific scheme for very specific companies
01:08:23.109 –> 01:08:32.939
Christian Oliveira: which have power. It's not something massive that will affect the whole ecosystem. So I find it difficult that
01:08:33.430 –> 01:08:36.850
Christian Oliveira: they will do that. And also,
01:08:37.319 –> 01:08:53.510
Christian Oliveira: the problem is that you will depend on the platform a lot more. Because right now you depend on the platform to get traffic, but then it's your site, and you put your rules, your stuff. Here you will just create the content and hope for the best, in terms of
01:08:53.510 –> 01:08:57.370
Christian Oliveira: them selecting your content, and maybe you can track the number of
01:08:57.390 –> 01:09:02.790
Christian Oliveira: times they say you appear, or things like that. Also,
01:09:02.939 –> 01:09:12.700
Christian Oliveira: another topic I had here that I think is interesting: will we be able to spam the model?
01:09:12.700 –> 01:09:30.160
Christian Oliveira: Imagine we start creating millions of web pages stating one fact, or one piece of information, or connecting whatever words we might want. And that can potentially, because that's how it works, affect
01:09:30.270 –> 01:09:49.590
Christian Oliveira: the answers that the bot gives to the user. And this will happen, because it happens with Google: people spam and create whatever they can in order to try to game the system. So imagine that,
01:09:49.590 –> 01:09:53.390
Christian Oliveira: because if we have the power of AI to create content, we can also use it
01:09:53.470 –> 01:10:03.570
Christian Oliveira: against the system. We can start using AI to generate sites that mention, I don't know, whatever two words, and whenever someone asks about that,
01:10:03.720 –> 01:10:14.680
Christian Oliveira: the model has so much probability of seeing those two words appearing one after another that it will pick them. So it will also be
01:10:14.770 –> 01:10:29.360
Christian Oliveira: complex, I think, recursively, because the version we have now doesn't get new content, but Bing does, and probably all the new ones will
01:10:29.360 –> 01:10:44.320
Christian Oliveira: re-feed themselves with new content. So we can start doing things like that: trying to influence the chat, trying to convince it that, I don't know, if you ask who the President of the USA is,
01:10:44.480 –> 01:10:53.070
Christian Oliveira: it doesn't answer the present one. That's another thing which happened with Google; I don't know if you remember the Google bomb thing,
01:10:53.410 –> 01:10:55.030
Christian Oliveira: things like that. So
01:10:55.230 –> 01:11:02.610
Christian Oliveira: I think there are a lot of things like this that are still potential issues, and we don't really...
01:11:02.990 –> 01:11:16.650
Christian Oliveira: And also, the people that create the system, they don't know what will probably happen, or what the underlying process is, because they can't really predict what the machine will answer. They just train the model and try to adjust it
01:11:17.110 –> 01:11:19.780
Christian Oliveira: with the parameters they can. But
01:11:20.020 –> 01:11:22.480
Christian Oliveira: I think we will see a lot of things like that.
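[Editor's note] Christian's scenario, flooding the corpus so that two words co-occur often enough for the model to pick them up, can be illustrated with a toy bigram counter. This is a deliberate oversimplification (real chat models are not bigram counters, and the tiny "web" below is invented for the sketch), but it shows the mechanism he describes: continuation probabilities follow corpus frequencies.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count word-pair frequencies: a crude stand-in for how often a
    language model sees one token follow another during training."""
    follows = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for a, b in zip(words, words[1:]):
            follows[a][b] += 1
    return follows

def most_likely_next(follows, word):
    """Pick the highest-frequency continuation for `word`."""
    return follows[word].most_common(1)[0][0]

# A small "web" where the organic consensus pairs the phone with "great".
organic = ["the phone is great"] * 10

# A spammer floods the corpus with pages pairing the phone with "terrible".
spam = ["the phone is terrible"] * 100

clean = train_bigrams(organic)
polluted = train_bigrams(organic + spam)

print(most_likely_next(clean, "is"))     # "great"
print(most_likely_next(polluted, "is"))  # "terrible"
```

With only the organic pages, "great" is the most frequent continuation of "is"; after the flood, "terrible" wins purely through repetition, which is exactly the attack surface being discussed.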
01:11:22.950 –> 01:11:30.200
Pedro Dias: I think with these systems it's gonna be a bit more subtle than that.
01:11:30.320 –> 01:11:31.950
Pedro Dias: it’s going to be more about
01:11:32.340 –> 01:11:36.420
Pedro Dias: manipulation, kind of making it look like it's real,
01:11:36.550 –> 01:11:47.570
Pedro Dias: like passing false information, you know, biased information, trying to bias the AI. Do you remember that initial Microsoft
01:11:47.840 –> 01:11:50.970
Pedro Dias: test that was done like in 2014
01:11:51.230 –> 01:11:55.940
Pedro Dias: or 15, when they launched this AI, and
01:11:56.430 –> 01:12:05.590
Pedro Dias: people turned it into a Nazi? Do you remember it? So it was not so much about, you know, making it
01:12:05.610 –> 01:12:07.200
spit out spam.
01:12:07.760 –> 01:12:18.260
Pedro Dias: It was about biasing it with content that, you know, kind of makes it look almost credible, but
01:12:18.300 –> 01:12:26.580
Pedro Dias: you know it's not. Because trash, spam, what you are used to in search, is probably not gonna
01:12:26.610 –> 01:12:28.750
Pedro Dias: work the same way
01:12:28.860 –> 01:12:31.180
Pedro Dias: in these systems.
01:12:31.200 –> 01:12:49.980
Pedro Dias: But I'd say the real dangers are going to be in the biased information that you're gonna see from bad actors, and that's what they are starting to deal with. That's why Microsoft retracted Bing after they launched it in a more free form.
01:12:49.980 –> 01:12:58.420
Pedro Dias: Now, you know, you ask like two things and it kind of pulls the plug and says: I cannot say anything more, because...
01:12:58.480 –> 01:13:06.450
Frederico Carvalho: Let me tell you one thing that I have seen, Christian. I don't know if you,
01:13:06.700 –> 01:13:26.400
Frederico Carvalho: if any of you guys, have tried to put your name in it and ask ChatGPT: what do you know about Frederico Carvalho, who is he, write his bio. Most of the time it knows something, but it is not complete. And one of the things that
01:13:26.530 –> 01:13:42.570
Frederico Carvalho: I have tried: I have put my bio in and said, okay, memorize this. And it memorized it, and then I asked ChatGPT to rewrite the bio one, two, three times,
01:13:42.610 –> 01:13:46.900
Frederico Carvalho: and I did the same with other things that are not listed,
01:13:46.910 –> 01:13:54.660
Frederico Carvalho: for my girlfriend, and it memorized what I told it to.
01:13:54.670 –> 01:14:03.430
Frederico Carvalho: So the thing that Christian was saying, and Pedro was also saying: maybe someone
01:14:03.490 –> 01:14:18.430
Frederico Carvalho: puts false information in, because someone introduced information, like on Wikipedia, inside it, and it assumes that information is correct until someone corrects it again. Makes sense?
01:14:18.770 –> 01:14:35.430
Diogo: So, I don't think the model works like that. Whatever you do in your chat doesn't influence other people's chats, unless... I know, I know your experience. Yeah, I know. But supposedly it doesn't work like that.
01:14:42.430 –> 01:14:48.610
Christian Oliveira: Yeah, but even if it's not like that, imagine, I don't know, you are Samsung
01:14:48.630 –> 01:15:04.660
Christian Oliveira: and you want to start convincing it that the last iPhone is worse than yours. Because the information they give is not based on experience; it's not based on
01:15:04.680 –> 01:15:24.250
Christian Oliveira: analysis of the topic. It's not like ChatGPT receives the question "which is better, this or this" and thinks about it, compares, and says: oh, this one has more memory. It's not like it does that. It just predicts how best to complete the sentence, so it will use the information it finds,
01:15:24.540 –> 01:15:38.140
Christian Oliveira: or found before, to create a sentence based on the input. So you could potentially spam it; I don't know if it will be easier or more difficult. But imagine that,
01:15:38.140 –> 01:15:46.670
Christian Oliveira: because this happens nowadays with news: we see people creating news sites to modify beliefs and
01:15:46.670 –> 01:16:00.490
Christian Oliveira: distribute fake news and all of that. So imagine that at scale, not like one site creating one piece of... / But Christian, what is stopping Samsung from doing that right now with
01:16:00.620 –> 01:16:09.940
Christian Oliveira: Google? And there is no single answer; there is no single answer to this question. You get a list of sites, and you can trust or not trust those sites.
01:16:09.940 –> 01:16:28.180
Christian Oliveira: And those sites might be influenced, but which one is in the first position, or the first 10 positions... Yeah, but as a user you can see who is behind it. That's what I mean: on ChatGPT you get an answer and you believe it or not, and it's very dangerous for that reason. And
01:16:28.180 –> 01:16:38.240
Christian Oliveira: it has much more power, because it's not like a piece of... it's something that talks to you as a person that you can trust. It's not like a search engine that
01:16:38.240 –> 01:16:49.160
Christian Oliveira: can say: if the result is not good, don't blame me, I just found this on the Internet. You don't blame Google so much for that, because Google is just
01:16:49.160 –> 01:16:58.460
Christian Oliveira: ranking things, and that's all. But this is, like, intelligence; that's the name of it, artificial intelligence. So if people start using it and trusting it,
01:16:58.590 –> 01:17:28.550
Christian Oliveira: it will be very, very dangerous in that sense. And if it's dangerous, of course people will do all kinds of things to influence it, because if you can influence it, you will have a massive impact on sales, if it's mainstream. Imagine that people... But that's the same with Google, the same with Google: even with the featured snippet you can do spam around it, you can replace the featured snippet. No, but the featured snippet sounds like a small thing. I mean, for
01:17:28.980 –> 01:17:46.900
Christian Oliveira: the important queries, like when you buy something, something that involves money... "How tall is the Eiffel Tower", okay, maybe you can spam that, but that's not a big thing. The big thing is that
01:17:46.900 –> 01:17:55.900
Christian Oliveira: you convince the AI to recommend your product and not the other one, for example. And that's,
01:17:56.030 –> 01:18:14.460
Christian Oliveira: that's complex, because, as we are seeing, it talks with language that sounds completely true. You read anything created by ChatGPT and it's like: wow, this sounds perfect, I cannot not trust it, because it sounds like that. And
01:18:14.640 –> 01:18:22.520
Christian Oliveira: so I think a lot of these systems will have to adapt to those behaviors, because people will do it. People will create websites to spam.
01:18:22.640 –> 01:18:28.900
Christian Oliveira: If it's possible, they will do whatever they can to try to influence it, I think at least.
01:18:28.950 –> 01:18:39.520
Diogo: Okay, since we're reaching our time, let's do a round up, and I'm gonna ask everyone that's still here,
01:18:39.620 –> 01:18:43.100
Diogo: including you, Glenn.
01:18:43.260 –> 01:18:50.380
Diogo: what does the future look like? Is it good or bad, or whatever you want to say about this,
01:18:50.400 –> 01:18:52.840
Diogo: your takeaway,
01:18:52.930 –> 01:18:59.100
Diogo: or any idea currently on your mind about this. Ana, do you want to start?
01:19:02.740 –> 01:19:10.840
Diogo: One second...
01:19:10.860 –> 01:19:15.330
Ana Verissimo: I was very disappointed, because I was asking it who the founders of the Lisbon SEO meetup are,
01:19:15.480 –> 01:19:20.300
Ana Verissimo: and Christian and Fernando were completely ignored.
01:19:20.700 –> 01:19:32.440
Ana Verissimo: It was telling me who the founders of this meetup are; it only got one right, and then it made up the rest, no matter what.
01:19:33.000 –> 01:19:38.250
Ana Verissimo: And these are the 3 founders and current organizers of this meetup, so...
01:19:38.410 –> 01:19:39.230
Diogo: funny
01:19:39.450 –> 01:19:49.100
Diogo: It's paying you back because you use it so much.
01:19:49.270 –> 01:20:09.220
Diogo: It's already bombing you. There's actually a theory where the AI is going to turn on everyone that was against it; it even has a specific name, I can't remember the name right now. But that's funny. So, Ana, what do you take from this today, and what are your feelings?
01:20:09.790 –> 01:20:19.310
Ana Verissimo: They haven't changed much compared to when I joined. I'm still not really buying into the whole craziness around it. And I think,
01:20:19.870 –> 01:20:27.280
Ana Verissimo: yeah, it's gonna, of course, impact search. But I don't think it's gonna completely change everything from one day to another, replace
01:20:27.530 –> 01:20:33.580
Ana Verissimo: Google and replace writers and SEOs and, you know, Stack Overflow, and replace everything.
01:20:33.900 –> 01:20:41.870
Ana Verissimo: I think it's gonna be more nuanced. I also think that at this point, and especially hearing you, who use it more than me, for example,
01:20:42.120 –> 01:20:47.920
Ana Verissimo: it sounds a bit niche at this point, as in: it's very useful in certain areas,
01:20:48.140 –> 01:20:54.840
Ana Verissimo: but the majority of people don't go to Google to search for code or data, you know, or how to improve their, you know,
01:20:54.850 –> 01:21:02.520
Ana Verissimo: Excel usage. For the majority of people, that's not what they're gonna do; it's not part of their searches on Google. So
01:21:02.780 –> 01:21:10.200
Ana Verissimo: it's also unclear to me if it's going to stay a niche thing, or if it's actually going to evolve into something more broad.
01:21:12.710 –> 01:21:18.200
Ana Verissimo: Yeah, that's my take. But also, of course, I think
01:21:18.560 –> 01:21:26.520
Ana Verissimo: it will really impact search. I'm not saying it's not going to have an impact, that it doesn't exist. I just don't think
01:21:27.590 –> 01:21:30.080
Ana Verissimo: it's going to change everything from one day to another.
01:21:31.010 –> 01:22:00.100
Diogo: Okay, okay. It's interesting to see how you're using it as well, honestly. That's a very good point, because it's definitely a bridge that you start creating, right? When people start talking about it, you start thinking of all the bridges, everything that you can do with it, right? And whenever someone shows you what they're doing, it sort of creates this bridge, like: oh, hold on, maybe I can do this as well, you know. And that's very important.
01:22:00.110 –> 01:22:01.810
Diogo: Pedro, do you want to go next?
01:22:02.930 –> 01:22:05.930
Pedro Dias: Sure. Well, I think...
01:22:06.170 –> 01:22:09.250
Pedro Dias: to be honest, I’ve been thinking about this, and I think
01:22:09.500 –> 01:22:13.900
Pedro Dias: search is going to be the least of our problems regarding this, because
01:22:16.710 –> 01:22:23.330
Pedro Dias: the potential that a tool like this has to disrupt a lot of things is really high.
01:22:23.500 –> 01:22:29.980
Pedro Dias: And if you think that we are just dealing with something that,
01:22:31.800 –> 01:22:44.690
Pedro Dias: when you compare it to the scale of what's in the pipelines... So, for example, this model that we are dealing with, as I was saying a while ago, is trained on about 175 billion parameters.
01:22:45.310 –> 01:22:51.720
Pedro Dias: And if you think about the next one, that Google hasn't released yet, which is based on LaMDA,
01:22:53.090 –> 01:22:58.500
Pedro Dias: it's trained on 500-and-something billion, 540 billion.
01:22:58.780 –> 01:22:59.720
So it’s like
01:23:00.190 –> 01:23:05.950
Pedro Dias: 3 times more than this one that we are dealing with, almost 4 times more.
01:23:06.590 –> 01:23:10.350
Pedro Dias: And the thing is.
01:23:11.560 –> 01:23:26.110
Pedro Dias: if you remember, in the old days, the way that we learned to deal with search engines, the way that we learned: oh, this is the useful information that it can give me. And then we started discovering how people could do other things with it as well.
01:23:26.820 –> 01:23:39.320
Pedro Dias: And we are gonna see this in a much more exponential way, because now it's not only search that it is going to disrupt; it's everything else,
01:23:39.570 –> 01:23:46.090
Pedro Dias: and it's gonna be exploited on the good and the bad side as well.
01:23:46.130 –> 01:23:49.910
Pedro Dias: And I think we are all still very naive
01:23:49.990 –> 01:23:54.400
Pedro Dias: to deal with something like this. Until we learn
01:23:55.000 –> 01:24:06.250
Pedro Dias: what to look out for and what to be careful about, we are going to have to endure a lot of bumps on the road,
01:24:07.560 –> 01:24:08.550
and
01:24:08.560 –> 01:24:10.370
Pedro Dias: so i’m kind of
01:24:10.530 –> 01:24:17.010
excited on one side, like: what a time to be alive! But on the other side,
01:24:17.170 –> 01:24:36.550
Pedro Dias: my, I don't know, my cautious side is kind of ringing some bells, because of what I've seen people doing when I was spam fighting at Google. If I kind of port that to what is possible to do with something like this,
01:24:36.560 –> 01:24:38.410
Pedro Dias: It can
01:24:38.690 –> 01:24:42.570
Pedro Dias: be really kind of scary. I mean,
01:24:42.790 –> 01:24:48.900
Pedro Dias: you know, in Google you had the power to switch off some people's livelihoods.
01:24:49.110 –> 01:24:52.480
Pedro Dias: Now imagine this applied to a model that, you know,
01:24:52.620 –> 01:24:55.460
Pedro Dias: permeates a lot of stuff.
01:24:55.480 –> 01:25:08.250
Pedro Dias: not only online stuff, because it's gonna be present in your software, in your TV, in your appliances, in your whatever. So
01:25:09.550 –> 01:25:19.540
Pedro Dias: I think there's a lot that we are going to be learning. And what we need to learn first is what to be careful with,
01:25:19.980 –> 01:25:28.000
Pedro Dias: and, you know, the things that we need to look at, and how this can be exploited. Because not even Microsoft, or...
01:25:28.100 –> 01:25:44.080
Pedro Dias: I think Google hasn't released it because they are not comfortable with the ways they think this can be exploited. And we saw this with Microsoft, right? Microsoft went ahead and released it: here it is, we are not afraid.
01:25:44.150 –> 01:25:45.720
Pedro Dias: And now they kind of,
01:25:46.070 –> 01:25:52.460
Pedro Dias: you know, locked it down to a point that you cannot do more than 2 prompts without it, you know, retracting.
01:25:53.940 –> 01:25:56.840
Pedro Dias: And this shows that they are not confident at all in,
01:25:57.090 –> 01:26:02.480
Pedro Dias: you know, how they control this. Because social engineering is
01:26:02.530 –> 01:26:03.890
way
01:26:04.030 –> 01:26:05.160
Pedro Dias: more
01:26:06.630 –> 01:26:12.960
Pedro Dias: dangerous, because it's not in your face. It's sneaky.
01:26:13.310 –> 01:26:19.530
Pedro Dias: You know what I mean? It's a sneaky and deceiving way,
01:26:19.760 –> 01:26:21.000
Pedro Dias: that you are not.
01:26:21.020 –> 01:26:24.150
You're almost not aware that you are being deceived.
01:26:25.540 –> 01:26:35.870
Pedro Dias: So for me this has all of those dangers, and applied at this scale, with the potential that it has, for me this is very
01:26:36.120 –> 01:26:41.290
Pedro Dias: dangerous. And let's see what happens when Google releases their version,
01:26:41.340 –> 01:26:43.890
Pedro Dias: and which one they release.
01:26:44.410 –> 01:26:54.230
Pedro Dias: So yeah, that's my... sorry for the Doomsday scenario, but I think it's kind of, you know,
01:26:54.360 –> 01:27:05.830
Pedro Dias: the cliche kind of thing: with great power comes great responsibility. And I think we are not fully aware of the power that these things have.
01:27:06.830 –> 01:27:09.780
Diogo: Good point. Fernando, want to go next?
01:27:10.880 –> 01:27:20.410
Fernando Morgado: So, I'm scared. Do you want a blanket? I think it's commercial, and we are not prepared,
01:27:20.620 –> 01:27:27.340
Fernando Morgado: and also it's a set of tools that we will include in our daily usage.
01:27:27.600 –> 01:27:42.170
Fernando Morgado: And also, I think people are smart, and they'll learn to use it in the best way for themselves. And also people are lazy, so they will set up, like, the most basic setup and go along with it.
01:27:43.700 –> 01:27:57.780
Fernando Morgado: For me it will be amazing if someone can learn with the help of AI on their side. But it also scares me that we are outsourcing our great power, you know, because,
01:27:57.950 –> 01:27:59.980
Fernando Morgado: going back to SEO:
01:28:00.840 –> 01:28:11.520
Fernando Morgado: most of the time when I'm researching keywords for a topic, I'm also learning. If I outsource that phase to a tool,
01:28:11.600 –> 01:28:16.430
Fernando Morgado: I'm also not learning, not evolving. So I think, yeah,
01:28:16.510 –> 01:28:23.250
Fernando Morgado: I think it will be amazing if we can use it, and also it will be very scary to use it.
01:28:24.940 –> 01:28:36.050
Christian Oliveira: Okay, Christian, want to light up the room a little bit.
01:28:36.270 –> 01:28:56.230
Christian Oliveira: I think I agree, in the sense that if this gets good enough to be a problem for SEO, SEO won't be the problem. I think we will reach a point where this AI can get to a point that people will start, like,
01:28:56.230 –> 01:29:25.440
Christian Oliveira: asking and trusting its answers. We will enter a much more complex point for humanity, where these machines, these models, can start influencing people, not just into buying an iPhone or whatever, but in elections, in big things, in determining what happens in real life. And at that point, that's it: SEO will be the least of our problems.
01:29:25.690 –> 01:29:39.200
Christian Oliveira: I don’t know if you’ve read it. This guy who is famous for the book Sapiens, he has another book which talks a bit about this, about the future, and he’s painting a very black
01:29:39.200 –> 01:29:49.820
Christian Oliveira: a future also in terms of If AI becomes that intelligent and that trustable people will start delegating a lot of things in it, and it will
01:29:49.920 –> 01:29:59.160
Christian Oliveira: generate like this. It’s like that, if it’s. Imagine that you start asking questions like, Who should I both?
01:29:59.230 –> 01:30:05.130
Christian Oliveira: And that answer is very dangerous, because
01:30:05.260 –> 01:30:17.930
Christian Oliveira: it can be something reasonable like not leaning to anything and just explaining, like, I don’t know the programs of the parties, but they can also give a straight answer, like I. I think you should both
01:30:17.930 –> 01:30:46.660
Christian Oliveira: that because of that information I have from you, which is another thing we haven’t talked about like. Imagine that it starts like a recollecting information from you from your questions and all that, so that will be really dangerous. And at that point I I don’t think we will be having problems with the CEO will have him problems with everything else. So yeah, I think we are like in the beginning. So this is very, very, very, very, very small. What we are seeing is just it’s very, bad in terms of
01:30:46.860 –> 01:30:56.390
Christian Oliveira: the potential it has. But the problem is that it’s it’s very. It’s much better from what we have in mind for some I didn’t imagine that I will have this
01:30:56.390 –> 01:31:15.630
Christian Oliveira: this year like I. I didn’t in mind that this was going to be possible this year, even if all the mistakes. So that’s May. That makes me not having good thoughts about the future. I’m sorry about that. This is very like Black Muir meets Chat Gpt. That’s what I:
01:31:16.390 –> 01:31:19.010
Christian Oliveira: Yeah, that’s a good point.
01:31:19.090 –> 01:31:26.820
Christian Oliveira: The problem is the input, which is humans.
01:31:26.860 –> 01:31:31.840
Fernando Morgado: Also, I think we’ll start following the
01:31:32.120 –> 01:31:45.240
Fernando Morgado: the importance of a good journalism and the all the ethnic that goes along with journalism. So but before that we will suffer for sure.
01:31:45.420 –> 01:31:46.640
Fernando Morgado: Okay.
01:31:47.070 –> 01:31:48.480
Diogo: Yeah, yeah.
01:31:48.680 –> 01:31:50.690
Diogo: okay, I’ll, I’ll share my view.
01:31:50.780 –> 01:31:58.830
Diogo: and it’s not gonna be unicorns as well.
01:31:58.900 –> 01:32:05.220
Diogo: Sorry. No, the the
01:32:05.370 –> 01:32:15.090
Diogo: I am. Actually, I’ve been talking about this and a a Christian know, and I think everyone knows in the in the in our also Whatsapp. I’ve been very worried about this
01:32:15.150 –> 01:32:22.820
Diogo: actually specifically what’s coming in the future. I don’t know what’s coming. I don’t know in terms of
01:32:22.820 –> 01:32:39.000
Diogo: how it’s going to involve in terms of society. I think it’s gonna have like a very low curve where, like a a learning curve, and it’s gonna go down like we’re gonna suffer from this, and then it will go up eventually. But
01:32:39.110 –> 01:32:45.340
Diogo: that opinion aside which is more emotional than than anything else.
01:32:46.000 –> 01:32:53.200
Diogo: I see in terms of of brands online. I see that being a trouble
01:32:53.200 –> 01:33:12.140
Christian Oliveira: like, I’m actually, I don’t know if I hate to say this, but I’m more and more on the side of Rand lately. Rand Fishkin, you know, like in terms of, forget about attribution, let’s work on brand, brand, brand, and
01:33:12.140 –> 01:33:22.230
Diogo: try to think that the the the the the company as a brand and work on the brand as much as you can, and
01:33:22.290 –> 01:33:32.210
Diogo: everything else will follow. because I I am very afraid, in terms of specifically on on search. I am very afraid that it, though
01:33:33.450 –> 01:33:51.380
Diogo: the speed that we’re evolving, and the the speed that this is involving, which is very fast, and I think it’s going to be way faster than what Google took us, or what Tik Tok took to get to whatever millions of of people. Because this is addictive
01:33:51.380 –> 01:33:58.750
Diogo: right, and people don’t want to think right, and I find myself Sometimes I went to chat, Gpt, and I just thought, like shit
01:33:58.770 –> 01:34:09.830
Diogo: I could think about this. Why am I not thinking about this. Instead, I’m just coming here and say, Write me a damn email. and you know, I find, like no hold on
01:34:09.830 –> 01:34:35.260
Diogo: you. I need to use my head. So this is how addicting it is, and and also how lazy! Fernando, you you’re saying right? So we are. And yeah, and I am very afraid of this, and and I’ve been talking more and more about this. And AI, If you’re ever going to consume this video that’s going to be on Youtube. Most probably. I love you very much. Of course, please don’t kill me, but
01:34:35.260 –> 01:34:53.070
Diogo: we we are nice people, but I am very, very worried, and that’s why I’ve been taking talking so much about it, and and and and talking and and using it so much. Because I I do want to be prepared to
01:34:53.110 –> 01:34:57.030
Diogo: whatever it’s coming as much as I can.
01:34:57.900 –> 01:35:06.250
Ana Verissimo: I know it’s closing thoughts, so just to add something that is true, and I hadn’t thought about it before: we have this huge problem of fake news nowadays, right?
01:35:06.660 –> 01:35:13.760
Ana Verissimo: It’s one of the main problems that we have. It’s so difficult for people to actually understand what’s real, what’s not real.
01:35:13.870 –> 01:35:20.340
Ana Verissimo: And you know, we have Google. We have social media channels trying to, you know, find these
01:35:20.480 –> 01:35:29.530
Ana Verissimo: the best way they can. The people are lazy to read everything, and worse than that, they are starting not to be able to actually understand what’s the fact? What’s like
01:35:30.080 –> 01:35:36.270
Ana Verissimo: like, yeah, what’s actually based on evidence and what’s just an opinion. And and these like.
01:35:36.380 –> 01:35:38.450
Ana Verissimo: like having an opinion. And
01:35:38.520 –> 01:35:54.730
Ana Verissimo: and now it’s knowing something. It’s a like People are starting not to really understand what’s the difference between these 2? And now going back to this strategy and the way it works with. You know the everything looks so great when sometimes, when in fact, at this point it is a
01:35:55.020 –> 01:35:56.050
Ana Verissimo: it’s a a
01:35:56.080 –> 01:36:02.920
Ana Verissimo: probability model, right? So what is giving us is just the most likely were to come after the previous words.
01:36:03.070 –> 01:36:04.160
Ana Verissimo: But it looks
01:36:04.640 –> 01:36:10.540
Ana Verissimo: as we all said right. It looks great from the outside. If you don’t go into details, well, sometimes even the details are right. But let’s
01:36:10.680 –> 01:36:12.800
Ana Verissimo: even sometimes people are wrong, but
01:36:13.470 –> 01:36:29.810
Ana Verissimo: but it’s it’s difficult to to know that you have to go into the specific, especially flying the hours. So how does that? Actually, maybe it will even make like fake news, you know where, like it will make it even harder for people, especially maybe younger people, to really differentiate
01:36:30.080 –> 01:36:31.440
Ana Verissimo: what’s factual.
01:36:31.450 –> 01:36:39.030
Ana Verissimo: What’s not factual, and maybe people will be even lazy because I feel like Google already makes it so easy for us. What can be easier than Google?
01:36:40.360 –> 01:36:43.650
Diogo: There’s always something in that. It can be easier, you know.
01:36:43.940 –> 01:36:49.290
Ana Verissimo: and too easy. Yeah, it can be just when it was him. Notice everyone.
01:36:49.450 –> 01:36:52.700
Ana Verissimo: It’s too easy can be, can be dangerous.
01:36:52.990 –> 01:36:59.280
Ana Verissimo: and the sensor people won’t think one they have a critical Yeah, I won’t be able to have a physical
01:36:59.300 –> 01:37:00.770
Ana Verissimo: thinking. Basically.
01:37:01.520 –> 01:37:03.770
Ana Verissimo: but it’s a very good point as well. You, I think
01:37:05.560 –> 01:37:09.250
Diogo: well on that cheery note.
01:37:11.530 –> 01:37:17.270
Ana Verissimo: I’m.
01:38:11.620 –> 01:38:15.830
Pedro Dias: So for 2 years, until.
01:38:16.810 –> 01:38:20.930
and don’t back about to make it of a sound.
01:38:55.060 –> 01:38:55.940
Ana Verissimo: it is.