[0:00:00.04] I'm actually curious to know afterwards what it is about the talk that was not appealing; not for those of you who are here.
[0:00:08.64] Obviously you guys are interested.
[0:00:10.24] Is it?
[0:00:10.64] Is it the title?
[0:00:11.44] Is it the description or is it the topic?
[0:00:14.24] Because I'm curious.
[0:00:16.8] This is actually the reason why I decided to write this book.
[0:00:22.56] Philosophy for Business Leaders: Asking Questions, Navigating Uncertainty, and the Quest for Meaning.
[0:00:28.08] And it has an entire part on what we're going to be discussing today.
[0:00:32.08] But I didn't include it in the subtitle because, for some reason, when we say ethics, everyone just discards it: yeah, we'll just do whatever is legal, who cares?
[0:00:44.08] I don't care about ethics.
[0:00:45.64] I want something more useful, etcetera.
[0:00:47.84] So, but anyway, let's jump in.
[0:00:52.6] No, but what was the question, Mahmoud?
[0:00:54.04] Like, what would you like to learn: whether it's the topic, the title, or the description of the talk that was not appealing?
[0:01:04.72] OK, all right.
[0:01:05.92] But you're assuming that, just because not everyone who registered came... Although, usually for the Stoicism
[0:01:12.4] talks
[0:01:13.16] there are more people.
[0:01:14.88] Got it, got it.
[0:01:16.16] I see. Yeah, yeah.
[0:01:17.08] No, not attendance; sign-ups.
[0:01:19.48] Got it.
[0:01:20.4] I see, I see.
[0:01:22.6] Yeah, yeah.
[0:01:23.16] I think it's an interesting thing.
[0:01:26.52] Yeah.
[0:01:27.88] We'll run a survey.
[0:01:30.56] Yeah, I'm really curious, because this is also the reason why I usually try to refrain from giving ethics courses.
[0:01:40.76] Yeah, it's a misnomer.
[0:01:43.28] It's all sorts of stuff we'll discuss.
[0:01:47.4] Like, honestly, personally, I thought it would actually be more appealing.
[0:01:52.4] Like, Stoicism is a bit more generic; it's just a philosophy of life.
[0:01:57.44] And I assumed, since we're mostly a business community, that there are at least some moral dilemmas that probably all people have
[0:02:08.04] that this could be helpful with. So yeah, Vlad, I'm not talking about attendance.
[0:02:15.84] I'm talking about registration.
[0:02:17.6] Usually for Stoicism, north of 150 people sign up; for this one I think only 65 signed up.
[0:02:25.16] So it's interesting.
[0:02:27.12] Yeah, I can give my two cents before we start, if that helps.
[0:02:29.48] It's small.
[0:02:29.8] Yeah, sure, sure.
[0:02:31.2] I think I just read the title, so: A Pragmatic Guide to Business Ethics.
[0:02:35.2] And then in the first slide you had decision making.
[0:02:39.12] I think the keyword decision making would potentially be more appealing than ethics, which is usually more nebulous and highly associated with corporate America as opposed to, you know, entrepreneurship.
[0:02:53] That's just my two cents.
[0:02:54.48] But I'm here so obviously something worked.
[0:02:57.72] Yeah.
[0:02:57.88] Yeah.
[0:02:58.24] No, it's just, again, I'm curious because it's fascinating; I'm still struggling with ethics in general.
[0:03:04.32] I try to refrain from talking about ethics.
[0:03:07.24] Ethics courses are usually very tricky to facilitate.
[0:03:11.84] So let's see how today goes.
[0:03:14.28] I assume many of you already know me.
[0:03:17.84] A brief kind of bio.
[0:03:21] I am Lebanese, born and raised, got my undergrad in finance in Lebanon, came to Spain, did my master's and PhD in philosophy.
[0:03:31.56] I also spent the year studying Spanish before I joined the Master's program.
[0:03:37.6] Then I went back to Lebanon and taught for seven years at two universities.
[0:03:43.24] Shit hit the fan at some point, so I decided to pursue other things.
[0:03:48.4] Since 2021 I've been freelancing and trying to be as cockroach-like as possible, trying to survive.
[0:03:57.72] Now I am back in Spain, in Salamanca, where I am currently a bridge dweller.
[0:04:02.8] And in addition to that, I'm also now an author, even though I wrote the book before that.
[0:04:09.32] But this is different; I feel different about it.
[0:04:13.84] So yeah, if you're interested in philosophy, if you're a business leader, entrepreneur, then this book may be of interest to you.
[0:04:20.4] Do check it out.
[0:04:23.88] So what is it that we're going to be doing today?
[0:04:27.24] We're going to be starting with a thought experiment.
[0:04:29.96] Some of you might have seen this thought experiment before.
[0:04:33.04] I think you can already probably predict what this thought experiment is going to be.
[0:04:38.32] Then we're going to try to define ethics, then examine three main ethical frameworks that, I'm going to claim, all of
[0:04:48.52] you somewhat use on a daily basis.
[0:04:52.64] Maybe you're just not aware that they are ethical frameworks.
[0:04:58] Then we're going to see some examples, and then I'm going to talk about ethical decision making itself: the actual framework, identifying dilemmas, etc.
[0:05:07.88] And then we'll revisit the thought experiment and we'll recap.
[0:05:13.24] The takeaways from this talk are as follows.
[0:05:17.4] As I said, first I want to show you that you already use some of the most fundamental ethical frameworks in your day-to-day activities, not only as a professional but also in your personal life, when you're dealing with others, and maybe on Twitter.
[0:05:33.56] Because we're all angry on Twitter and we try to vent there. Then, to provide you with some basic tools to carry out an ethical analysis, and to enable you to identify the different assumptions and frameworks underlying these different arguments.
[0:05:51.48] And I'm going to be highlighting this shortly as well.
[0:05:55.12] Like, why do we differ on ethical issues?
[0:05:59.6] Why do we espouse different arguments?
[0:06:01.88] Where is the problem?
[0:06:02.8] Is the problem in how we conclude what we conclude?
[0:06:06] Or is the problem elsewhere?
[0:06:09.52] But before that, a preface.
[0:06:15.64] It occurred to me yesterday to talk about this, and then today, serendipity has it,
[0:06:24.68] Jordan posted this.
[0:06:25.76] It was very interesting.
[0:06:28] Why do you think tobacco advertising was banned?
[0:06:34.24] Like, I do have the arguments there, but personally,
[0:06:38.96] I think it was probably more social pressure rather than moral reasons: tobacco became something of a social taboo, right, that shouldn't be promoted, especially to kids, but probably even to adults.
[0:06:54.4] But for example, alcohol is still promoted in general, right?
[0:06:58.6] Yeah, so, social pressure.
[0:07:00.44] I agree.
[0:07:01.12] But then you mentioned something: it shouldn't be promoted to kids.
[0:07:04.56] Why is that?
[0:07:06.96] Well, I think everyone agreed; it wasn't controversial anymore that tobacco is harmful and addictive, especially if kids start early.
[0:07:17.08] So when you say that, the argument there is that it's harmful to health, right? I think almost every reasonable person agrees that kids shouldn't start smoking or shouldn't pick up smoking, right?
[0:07:32.08] Yeah, exactly.
[0:07:33.52] And so of course this is social pressure.
[0:07:35.6] But then the underlying arguments, if we break them down: harmful for health.
[0:07:41.12] This is an ethical argument, because why should it matter?
[0:07:43.84] If I personally don't care about harming myself, who cares, right?
[0:07:51.88] But then it's not only harming yourself, also harming others.
[0:07:55.2] Don't advertise to kids.
[0:07:56.76] Why?
[0:07:57.08] Because they're impressionable.
[0:07:58.28] Because they're underage, they're not autonomous yet to make their own decisions. That's an ethical argument.
[0:08:05.8] And so, generally speaking, the reasons why tobacco advertising was banned, legally speaking: underlying that there were many arguments, including ethical ones, like you said, autonomy, right?
[0:08:24.6] Some people are maybe weak; they cannot really make reasonable decisions, or they cannot make decisions that require willpower, because it's addictive, etcetera.
[0:08:35.72] And then there are economic and health risks as well.
[0:08:40.44] But this is also there's an ethical dimension there.
[0:08:43.92] And so the reason why I started with this is because usually when I say ethics, everyone is like: we'll do what is legal and that's it. Sure, but why is something decided or agreed to be legal or illegal?
[0:08:59.88] And so the arguments that are put forward in order to decide whether something should be banned, whether something should be legalized, whether something should be illegal, are more often than not ethical.
[0:09:15.96] And it's mind-blowing to me, because I only came to this realization a few days ago.
[0:09:21.44] So I started looking it up and I was like: how?
[0:09:24.88] Based on what?
[0:09:25.44] So think marijuana, right?
[0:09:27.44] There are economic arguments, but then the majority of the arguments put forward for or against, more often than not,
[0:09:36.2] are ethical. But anyway, this is just a preface.
[0:09:39.56] Another example that is
[0:09:41.08] more relatable, by the way, I think.
[0:09:42.6] I think it's somewhat related to the problem of student debt in the United States, right, where colleges promote, I mean, they have the brand as being something very valuable, right, and push
[0:09:57.76] 16-year-olds and 18-year-olds to take on $100,000 in debt.
[0:10:01.44] Actually, you can basically propose that argument as well.
[0:10:08.68] I mean, I see both arguments. On the one hand, yeah, they're right; we'll get into a never-ending debate: should it be free, should it not be free?
[0:10:17.36] Should they be allowed to take on that debt?
[0:10:19] But, you know, all these debates that we have on a daily basis on Twitter and elsewhere really, by and large, are influenced or informed by, or have, an ethical dimension.
[0:10:34.4] Yeah, yeah, we'll get to that.
[0:10:36.68] We'll get to that.
[0:10:38.08] Another example is this, which I saw on Twitter.
[0:10:42.96] If you search "unethical" on Twitter, you'll find all sorts of interesting tweets about what people consider unethical.
[0:10:51] You know: plagiarism, lying, not being transparent.
[0:10:56] In this case I'm curious as to what you guys think.
[0:10:58.64] There seems to be a quite unethical social proof doing the rounds in the indie world, but it's being accepted.
[0:11:05.92] A common one being: you create the launch post on a social platform and display that social platform's logo on your SaaS, implying some sort of affiliation.
[0:11:16.36] And then the arguments put forward here are: great observation, let's keep it genuine and authentic.
[0:11:25.64] Integrity is key.
[0:11:27.76] And then, you know, we're talking about fake reviews, feedback, etcetera.
[0:11:32.96] What do you guys think?
[0:11:36.52] Yeah, I I don't want to, but yes.
[0:11:40.52] No, no, just joking.
[0:11:42.44] This is interesting, I think.
[0:11:45.28] And this is something that is very relevant to me, something I experience:
[0:11:50.76] how important it is to actually be ethical versus look ethical, because I think there's a difference.
[0:11:57.88] One of the one of the challenges I'm facing right now, I'm getting lots of accusations of doing unethical business practices just because I'm running a Black Friday discount campaign.
[0:12:08.76] Like, people see changes in prices as something morally questionable.
[0:12:15.32] And look, I sort of
[0:12:17.36] see where people are coming from.
[0:12:23.08] It is different people paying a different price for the same thing.
[0:12:27.72] Like, I understand; putting myself in a customer's shoes, if I paid $100 for something and the next day it's $50, I will feel some negative emotion.
[0:12:36.12] Like I get it, but is it unethical or?
[0:12:40.92] And even if it is ethical, is it important to make it look ethical?
[0:12:45.2] Like, not yesterday, a couple of days ago, I had a lawyer try to cancel me on Twitter.
[0:12:50.64] He was looking at my web archive and he told me: look, you're inflating your prices.
[0:12:57.84] It's not a true discount.
[0:12:59.32] This is against the California law.
[0:13:01.84] Like some company got sued for $10 million.
[0:13:05] Whatever.
[0:13:05.72] You hear things like that, right?
[0:13:07.68] And it makes you think right?
[0:13:10.32] I still believe, again, that I'm not doing anything wrong, but it certainly makes you pause.
[0:13:18.6] Yeah, that's definitely interesting.
[0:13:21.32] And hopefully, by the end of today's talk, you'll be able to articulate it in a better way, because you'll be able to understand where these accusations are coming from.
[0:13:33.28] As I said, I'm going to propose only three frameworks.
[0:13:35.68] There are other frameworks as well that we could discuss, but these three are usually the most popular and widely cited.
[0:13:44.96] I left a few out because otherwise this would drag on and because some of the other theories can be quite complicated, like for example the Justice theory, which is the one that I'm assuming Barnes and Noble tried to adopt.
[0:14:04.08] I don't know the story behind it, but I'm assuming it's a justice thing, a social justice thing.
[0:14:04.92] The one on affiliation is also an interesting one.
[0:14:06.84] That's where, for example, you might notice
[0:14:09.8] a customer with an address at amazon.com or google.com, and then you claim that you have customers from Google or Amazon.
[0:14:17.84] That is a very common thing, which, again, I understand; it works because a person seeing your product being used by a well-known company creates credibility.
[0:14:31.36] But is it
[0:14:33.72] ethical?
[0:14:35.48] Right.
[0:14:35.72] Actually, is it really, truly being used, or is it implying some sort of association? Which I think is the accusation being made here. And there are so many of these. Sometimes, to be honest with you, even testimonials.
[0:14:49.36] I wonder. Whenever somebody leaves a good review for me publicly, I put it on my website, only if it's public, only if they put it on Twitter, whatever.
[0:14:58.76] And I never had problems with this, to be honest.
[0:15:00.72] I don't think I ever had somebody
[0:15:03.72] complain.
[0:15:04.52] But I wonder if, again, there's a line there where people don't want to be associated with me, necessarily, because I took their tweet and put it on my site.
[0:15:14.84] Like, am I implying something stronger than they intended?
[0:15:18.64] There are all these things where, again, I want to be fair, but I also want to use all the advantages that I get, because a testimonial works a lot, right?
[0:15:29.76] It's a very powerful thing.
[0:15:31.96] Yeah, I presume I did the same thing.
[0:15:33.68] Yeah.
[0:15:33.88] Go ahead, Hassan. Testimonials, yeah.
[0:15:37.88] Yeah.
[0:15:38.04] No, I just wanted to piggyback on that.
[0:15:40.48] So for the tweet that we're looking at here, I think the key differentiator is probably the implication, right?
[0:15:47.84] So if you specifically say Google, Amazon, Microsoft are my customers and they're not,
[0:15:56.08] that's one thing.
[0:15:57] But just having their logos on there without explicitly saying that they use your products, that's where the line gets blurred, because it could edge on: well, it's not illegal because you're not lying, but you're being unethical.
[0:16:12.48] Misrepresenting.
[0:16:13.68] You're misrepresenting the truth.
[0:16:16.28] Yeah.
[0:16:16.56] Yeah, but based on what?
[0:16:20.28] So what's the
[0:16:20.72] problem?
[0:16:21.24] Well, based on, again...
[0:16:25.6] Based on the likely assumption that, if they see the logo of Amazon as your customer, people will think that Amazon the company placed an order for your product and bought it,
[0:16:35.44] not just some random person who happened to use an Amazon e-mail address,
[0:16:41.16] which is a much weaker link.
[0:16:44.08] And again, it's blurry, right?
[0:16:46.28] And based on what would this be wrong?
[0:16:51.24] It's also norms maybe.
[0:16:53.28] I mean, yeah, exactly, right.
[0:16:54.48] It's a common, expected understanding of what this means that you're abusing.
[0:17:02.84] This is where I think the ethical frameworks would actually help, because if I ask you: based on what are you examining this?
[0:17:11.64] So this is where these tools would help you articulate your thoughts better.
[0:17:16.08] Because this is the problem with ethics.
[0:17:18.48] It's like we all have a sense of it.
[0:17:19.76] So Hume, David Hume, Scottish philosopher, said ethics is just a faculty that we developed over time.
[0:17:29.48] And it's a sentiment.
[0:17:30.72] We don't know where it comes from.
[0:17:32] It's just a sentiment that we have.
[0:17:33.6] And so his entire theory is called the theory of moral sentiments.
[0:17:40.52] And he gives the examples of incest, infanticide, and parricide amongst animals; these are normal in the animal kingdom, but human beings frown upon them, and he could only explain that by saying it's a sentiment.
[0:17:55.6] So this is one kind of framework.
[0:17:57.72] We're not going to discuss this today; I don't usually discuss it, because you go over it in like one minute.
[0:18:02.88] So this is one framework. But it's interesting to see you guys discuss this, and it's also interesting that you mentioned whether it's more important to look ethical or to be ethical, because the second slide is the following.
[0:18:18.68] I think you are all going to recognize some of these people here.
[0:18:28.04] Here I have, for those who don't know them: Elizabeth Holmes, SBF, and then Caroline Ellison, if I'm not mistaken, allegedly his girlfriend, SBF's girlfriend.
[0:18:43.24] I'm not gonna talk about Elizabeth Holmes.
[0:18:45.4] You probably all know her from Theranos; they were claiming so many things, it was a fraud.
[0:18:50.52] In the case of SBF, he was trying to look ethical, right?
[0:18:54.8] He was an effective altruist.
[0:18:56.68] He wanted to show that, you know, I'm the most generous billionaire.
[0:19:00.32] I give my money to everyone, I donate to charities, etcetera.
[0:19:06] Well, it was not his money, at least not all of it; they call him a fraud and all sorts of stuff.
[0:19:10.2] And I don't want to judge these people, or discuss whether it's fair that they ended up in prison, or for life.
[0:19:18.72] But, you know, it's like what Daniel said, right?
[0:19:22.76] Is it more important to seem ethical or to be ethical?
[0:19:27.32] It's like, what is it that we want?
[0:19:28.92] It's just curious to see how these people who claim that they're, you know, authentic and doing the right thing all ended up where they did.
[0:19:39.8] You know, I wonder, for example, especially with Holmes,
[0:19:44.8] I wonder, if her product had ended up working, which
[0:19:51.12] there was presumably a possibility of, right,
[0:19:53.72] I wonder what people would think. Because, for example, Elon Musk gets accused of doing very questionable things, especially with Tesla; he made extremely wild claims and over-promised self-driving cars and robotaxis, and people paid for things that never really got delivered, and so on and so forth.
[0:20:16.28] But he's sort of
[0:20:19.04] given a blind eye, just because there's actually a real product and people like it, right?
[0:20:24.08] And sort of the good things make up for the bad things.
[0:20:28.04] But if Tesla had flopped, it could almost be the same story, right?
[0:20:34.68] Of wild promises again.
[0:20:38.28] Maybe.
[0:20:38.68] Maybe. Because I don't even believe Elizabeth Holmes necessarily had the intention to...
[0:20:43.16] It was the claims as well. Like, in the case of Tesla, even though I don't know what wild claims he put forward,
[0:20:52.76] in the case of Theranos it's as though they had solved the biological problem by sending you, I think, a small kit where you do it at home.
[0:21:03.44] It's not only a wild
[0:21:05.28] claim; it was, I think, falsified data, if I'm not mistaken.
[0:21:09.52] Yeah, yeah, misleading investors, and misleading patients.
[0:21:12.84] Misleading customers, even though she wasn't even found guilty on that, but regardless.
[0:21:18.56] But it's another one of those things where history has shown us a lot of people stretching the line of what is ethical to reach an end, and if they succeed, then a lot gets forgotten.
[0:21:38.2] That's true as well. This is true.
[0:21:40.56] And this is why ethics is also tricky, because there's a fine line between what is legal and what is ethical.
[0:21:49.48] And then if you end up in the right, everyone applauds, and if you end up in the wrong, everyone starts accusing you of being
[0:21:57.04] a douche,
[0:21:57.64] and of being a fraud, and of being dishonest.
[0:22:05.64] And here are the ethical principles we usually have in our minds when we think ethics: trust, integrity, honesty, transparency, build-in-public kind of things.
[0:22:16.48] Autonomy, respecting people's dignity, respecting other people, your customers as well.
[0:22:21.76] So when we're talking about misrepresenting or misleading:
[0:22:25.04] well, we need to respect other people.
[0:22:26.72] We need to, as Kant would say, treat them as ends in themselves, not merely as means.
[0:22:31.16] And accountability, responsibility.
[0:22:33.96] So for example, all these people were held accountable.
[0:22:37.2] For example, in the case of the 2008 financial crash, many people were held accountable and ended up in prison, but then many people did not.
[0:22:47.16] And so this is also one of the questions: some people always tend to get away with it, and nobody cares.
[0:22:53.84] Duty.
[0:22:54.52] We talk about doing no harm in the case of medical doctors.
[0:22:58.48] In the case where you're putting out a product,
[0:23:01.92] you want to ensure that your product is safe.
[0:23:07.72] Legal actions get blended in there, unexpected consequences, harm, financial losses, etcetera.
[0:23:14.76] So ethical principles and considerations usually emerge around these topics.
[0:23:21.92] But so, what is ethics?
[0:23:24.44] Sasha?
[0:23:25.44] Did you want to add something?
[0:23:25.44] Yeah.
[0:23:26.8] What I notice is that these are all examples where the legal aspect or the other negatives somewhat align with unethical behavior.
[0:23:36.32] But I think it's much more interesting to see a case where, for example, someone
[0:23:41.16] gets punished legally but we would all say he acted in an ethical way. Or, we're getting there, we already had, like, Elon maybe, or people like that, who didn't get caught or got away with something.
[0:23:54.2] But especially when we all feel for someone and empathize, because we see that someone gets punished even though he acted completely ethically.
[0:24:02.52] But maybe you will get to that.
[0:24:04.32] Yeah, yeah, we'll get there.
[0:24:05.92] I have a couple of examples up my sleeve.
[0:24:10.44] But before we even get there, it's like:
[0:24:13.76] what issues?
[0:24:14.84] Because Daniel was talking about these issues, right?
[0:24:18.8] What would be considered ethical?
[0:24:21.6] Fluctuating prices, changing them, etcetera.
[0:24:24.68] Is this unethical?
[0:24:26.68] Is this a topic that should be evaluated from an ethical perspective?
[0:24:30.4] For example, pineapple pizza.
[0:24:33.08] Some people would, you know, go crazy.
[0:24:34.72] Oh, that's a wrong thing to do.
[0:24:36.44] But is this an ethical matter?
[0:24:38.28] Like, should we look at it from an ethical perspective?
[0:24:41.4] Some of you might have strong opinions on that, but I'll leave that to you. Deciding which small bet to pursue:
[0:24:48.64] if you have five projects you want to work on and you want to choose two,
[0:24:54.8] does ethics come into the picture?
[0:24:57] In this case, lying to others.
[0:25:02.6] And so usually an issue becomes ethical when it involves some sort of harm.
[0:25:11.4] It may cause harm, or it involves other people and may affect them one way or another, and not only other people but oneself as well, right?
[0:25:20.32] There is usually a consideration of:
[0:25:24.12] maybe this could cause harm, maybe this could lead to unintended consequences.
[0:25:32] And this is when ethics would come in.
[0:25:34.2] So in the case of pineapple pizza, whether or not you like it, at the end of the day, to our knowledge it's just a matter of preference.
[0:25:40.92] It's not going to hurt anyone, right? Deciding which small bet to pursue:
[0:25:46.32] if you hedge your financials, it's a financial risk.
[0:25:48.96] It's not even an ethical issue, right? You decide whichever house you want to buy.
[0:25:56.12] Also the use of certain technology. Yeah.
[0:26:01.84] So, usually, even within these principles that we are discussing, sometimes we agree on a principle, right?
[0:26:08.16] Like in the case of doctors, do no harm, or, in general, we all don't want to do harm.
[0:26:14.72] But we also have the principle: don't lie. What happens when these principles conflict?
[0:26:20.8] So if by telling the truth you're going to be harming someone, what would you do in this case?
[0:26:28.84] So it's complicated, right?
[0:26:30.96] Would you?
[0:26:31.6] Would you, would it be OK to lie, to do no harm?
[0:26:36.76] Or would you tell the truth because you don't want to lie, but then you may cause harm in the case of cultural relativism.
[0:26:44.36] We live in a global village nowadays and this is something that many business leaders from my interviews mentioned.
[0:26:52.08] It's almost they all agree on this when when part of doing business involves exchanging gifts with a certain culture.
[0:27:01.44] And it's wrong where you're based.
[0:27:03.8] So if your company is based in a place where exchanging gifts is wrong, that's considered bribery.
[0:27:10.52] But then you're meeting with a client who's part of his tradition is exchange gifts.
[0:27:17.44] What would you do in this case?
[0:27:19.04] Right?
[0:27:19.24] And I'm using this as the simplest example.
[0:27:21.68] I don't want to get into the more complex ones.
[0:27:26] One other thing that complicates ethics is that
[0:27:29.52] moral standards evolve.
[0:27:32.08] The ancient Greeks had their moral principles, right?
[0:27:35.6] And we usually look back and say, oh, these people were unethical.
[0:27:39.56] It doesn't really work that way because back in the day, these were the ethical standards that they had.
[0:27:45.16] Today we have certain ethical standards. A hundred years from now,
[0:27:48.72] they'll look at us and they'll say, oh, look at this
[0:27:52.96] idiot lawyer who accused Daniel of engaging in unethical work. And, you know, these standards... Yeah, slavery, by the way: up until today, slavery is legal, including in places like Lebanon.
[0:28:15.08] It's there.
[0:28:16.8] It's a practice.
[0:28:18.32] It doesn't take the same form or shape as before, but it's there.
[0:28:22.88] So yeah, it's complicated.
[0:28:26.44] But what I can tell you is that ethics, in general, guides us.
[0:28:32] It underscores everything we do.
[0:28:33.88] It informs our legal, religious and social norms.
[0:28:39.92] It involves the individual, the society.
[0:28:43.16] And it's a continuous attempt to at least try to find out which behavior is right, which behavior is wrong.
[0:28:51.96] Which behavior is OK.
[0:28:53] Which behavior is not OK.
[0:28:54.48] And so, as you might have seen, things evolve: at one point maybe we didn't know that smoking harms, but then it became an undeniable fact.
[0:29:06.2] But then companies would engage in, you know, data manipulation, etcetera, in order to win, etcetera, etcetera.
[0:29:11.28] It's an ongoing kind of activity where we try to figure things out. Now, Daniel has three or four ethical dilemmas
[0:29:21] he's
[0:29:21.84] trying to deal with; maybe the rest of you are also facing these issues, and in my case as well, right?
[0:29:28.72] When it comes to philosophy courses or the things that I offer, it's like you always think: well, is this OK?
[0:29:37.24] Is this fair?
[0:29:38] Is this cool?
[0:29:40.16] And this is maybe why I tend to use the marketing that I use, because I don't want to promise people anything.
[0:29:47.4] Because I don't want to feel like I'm being some sort of fraud in a sense, right?
[0:29:52.8] But then I don't know.
[0:29:55.12] So it's complicated.
[0:29:58.4] And this is where the thought experiment comes in.
[0:30:01.4] So, it's the trolley problem.
[0:30:05.92] I'm merely interested, and I would like all of you: even if you don't want to jump in via audio, you can just write it in the chat.
[0:30:15.56] This is a thought experiment that helps us see our responses to ethical dilemmas.
[0:30:20.68] And the reason why this is helpful is because you will start identifying which frameworks you usually tend to use, and then we'll discuss these frameworks and take it from there.
[0:30:32.24] So, has anyone here not heard about the trolley problem before, or not seen it?
[0:30:40.72] I've seen it, but I don't think it hurts to explain it to us.
[0:30:46.48] So, just picture yourself standing by a railroad track.
[0:30:53.8] You're here, there's a junction, and you observe that there's a trolley coming down the track. If it goes forward, if you don't do anything,
[0:31:08.28] it's on course to collide with
[0:31:12.92] five people, and it's going to kill them.
[0:31:16.88] But if you intervene, if you pull the lever, it's going to shift its course and it's going to kill one person instead.
[0:31:25.36] So your choice here is to not do anything and let five people die.
[0:31:31.68] And notice the wording here: you're not killing them, you're letting them die.
[0:31:36.44] You're not doing anything. Or interfering, pulling the lever,
[0:31:42.96] and killing one person.
[0:31:47.48] How many of you would pull the lever?
[0:31:53.28] No one.
[0:31:54.16] OK, Sasha, why would you pull the lever?
[0:32:01] OK, I could come up with a bit of a maybe sarcastic solution, because if the five guys are dead, I have maybe one friend who's alive.
[0:32:11.92] But if I kill the one guy, then, yeah, I have five friends whose lives I saved and who will be very grateful.
[0:32:23.24] So you're
[0:32:24.16] giving a pragmatic explanation.
[0:32:26.6] You're being basically pragmatic
[0:32:28.36] in this case.
[0:32:28.84] You're saying: I would be saving five lives, I would be killing one person, it's OK.
[0:32:33.92] In case you pull the lever, legally you will be charged with murder.
[0:32:37.92] OK, Vlad, but let's assume this is not the case.
[0:32:41.2] Let's assume we don't have a law for that yet.
[0:32:46.84] So what would be your choice?
[0:32:49] I'd hope they could become the.
[0:32:53.28] Interesting.
[0:32:54.28] Yeah.
[0:32:54.64] I mean,
[0:32:55.24] I think the context matters: can you see the people? Because if one of them is a child, you're kind of going to go save the child, maybe, as opposed to... do you know what I mean?
[0:33:07.36] The context of who the people are, I think is important.
[0:33:10.96] The context is definitely important.
[0:33:13.24] I'll add layers of complexity shortly, but just as is: you don't know anything, you don't know the people.
[0:33:21.6] I think I do nothing.
[0:33:23.16] You do nothing.
[0:33:23.88] Why?
[0:33:24.32] I think I do nothing, because I think that the guilt of commission is much stronger than the guilt of omission.
[0:33:34.8] I would say possibly.
[0:33:39.6] Could you try to think of another reason?
[0:33:42.72] Let's not think in terms of guilt.
[0:33:46.8] Think about your action.
[0:33:49.84] Well, inaction is easier to justify than action.
[0:33:55.4] OK, but in this case, you don't want to do the wrong thing, which is killing someone.
[0:34:04.8] Is that why?
[0:34:07.72] Yeah, I mean, yeah, because then I've actually influenced the death of a person by something I have done, whereas otherwise I can kind of say, well, I didn't do anything.
[0:34:19.6] When the five people die, I'm very sad for them and their families.
[0:34:22.16] But I didn't do anything.
[0:34:23.48] I could have done something, but I didn't.
[0:34:25.84] But in any case, you make a decision.
[0:34:27.4] I mean, whether you act or not, you can't avoid it:
[0:34:29.32] you make the decision to not make a decision.
[0:34:34.4] Yeah, there's a decision, yeah.
[0:34:36.76] And usually... I'm not going to talk about dropping the guy off a bridge.
[0:34:44.24] The other example that is usually given is: assume this is not a train.
[0:34:49.28] Assume that you're a medical doctor and you have five patients.
[0:34:55.28] They all need organs to survive.
[0:34:58.8] And then here comes a very healthy person.
[0:35:02.16] What's this guy called?
[0:35:03.44] Bryan Johnson?
[0:35:04.32] No, it's him.
[0:35:06.76] So you decide to
[0:35:08.68] harvest his organs?
[0:35:09.68] What?
[0:35:10.76] Yeah, exactly.
[0:35:11.52] So imagine you're the doctor and you want to save the five.
[0:35:16.08] By that very same logic... and yet in this case, it seems like nobody would choose to harvest the organs of a healthy person, right?
[0:35:22.68] It seems easy, well, because there is something intangible that we understand: that it's unfair, and that it's just the circumstances.
[0:35:31.04] I think.
[0:35:31.36] I think there's something interesting in the circumstances of life like that.
[0:35:35.16] Yeah.
[0:35:35.56] Like, there are five unhealthy people;
[0:35:38.12] it's accepted to be just fate, right?
[0:35:41.6] Again, it's part of societal norms, like you said, right?
[0:35:44] I mean, animals would behave differently.
[0:35:49.08] We see it all the time: birds dropping their siblings out of the nest and other things like that.
[0:36:00.04] But we humans don't do these kinds of things right.
[0:36:04.24] I think it's...
[0:36:05.88] I mean, I've heard of this quite a number of times.
[0:36:09.28] I mean, it's really the frameworks they're mentioning.
[0:36:12.8] Yeah, yeah, let's.
[0:36:13.88] Sorry, go ahead.
[0:36:15.4] No, no, I do mention that in the book as well.
[0:36:20.08] So, but yeah, go ahead.
[0:36:21.12] OK, yeah, yeah.
[0:36:21.8] I mean, it's just, you know, what are the consequences for society? That's one of them, right?
[0:36:27.76] Like 5 people being alive versus one?
[0:36:30.52] And it's an easy answer.
[0:36:31.68] You kill the one person for the greater good of society.
[0:36:35.8] That's the consequentialist view.
[0:36:38] But then the flip side of that is the categorical moral theory, which is: the act itself feels wrong.
[0:36:46.32] So in the case of the patients, pulling the plug and killing him, that alone, regardless of what it means for society, as a framework is just blatantly wrong and unacceptable based on social norms.
[0:36:58.64] So it's always a dilemma between
[0:37:01.08] those things.
[0:37:02.44] This is precisely it, and this is why
[0:37:05.24] it's always interesting to use this as a starting point, and why philosophy people like this example: it contextualizes things such that it gives you the very basic frameworks that we usually use.
[0:37:21.8] Because sometimes, if you're not aware of how you're thinking or reasoning about these issues, you might say, for example: it's just, like Hume said, a sentiment I have.
[0:37:33.72] Or maybe there's something there that I cannot really point out. But in fact, and I don't know what your opinion will be when we discuss these, usually when you're evaluating these actions you're employing the frameworks that Hassan just mentioned, right?
[0:37:57.6] When you prefer killing one person and saving five people,
[0:38:03.96] You're examining, not the action itself.
[0:38:06.92] You don't care about killing someone.
[0:38:09.36] You're looking at the consequences of your action.
[0:38:12.6] You're evaluating whether killing one person will be more beneficial than letting five people die, and so you don't care whether the action is good or bad.
[0:38:24.28] Is this killing murder? Legal, illegal? I don't care.
[0:38:28] What are the consequences of my action?
[0:38:30.72] I'm saving 5 lives.
[0:38:32.8] And I'm killing only one.
[0:38:34.68] And so for the pros and cons: well, there are cons, of course, killing one person, but they're very limited compared to saving five lives; you'll make their families happy, it's better for the greater good, etcetera.
[0:38:51.68] We can add complexity to it and say: what if the person
[0:38:56.84] you're going to be killing is Einstein and the five people include Hitler, and stuff.
[0:39:01.76] I don't want to get there, but this is if we want to get nerdy about it.
[0:39:05.8] This is just to illustrate the concept, right?
[0:39:09.24] So for you, you're not looking at your action.
[0:39:13.28] I like the framework of reflection.
[0:39:16.24] Interesting.
[0:39:17] This is yeah, this is the silver rule.
[0:39:25.48] Yeah yeah.
[0:39:25.8] This is actually: don't treat
[0:39:27.56] others the way you don't want them to treat you.
[0:39:30.2] I like this because this is what I use for my business model decisions, in fact, right?
[0:39:34.4] I mean, how would this affect me?
[0:39:36.4] And I know it's an imperfect model, but I think it helps me a lot with at least feeling OK with the decisions I'm making, even though others may take it differently.
[0:39:43.96] Yeah, there's an issue with this, and it's scale, because you're also assuming how you want to be treated.
[0:39:53.56] But then you're assuming that others would want the same, and in that case you're not respecting their autonomy, because they might disagree with you.
[0:40:01.6] But if you decide not to kill that person and let five people die, you're not looking at the consequences of your action, you're looking at the action itself.
[0:40:14.92] And in this case, if you don't want to kill one person, just like Shane said, what you're saying is: killing someone is wrong.
[0:40:23.6] This is what he was trying to say with, you know, conscience, and intervening versus not intervening: there's something about murder that feels wrong.
[0:40:38.16] And this was even more salient and obvious with the case of the Doctor, because just like Daniel said, I don't want to be killing this healthy person because there's something wrong about the action itself just to save five other lives.
[0:40:53.56] And so you don't care about the consequences; you may as well be saving ten lives.
[0:40:59.32] What you're doing here is you're trying to look at the action itself.
[0:41:02.76] If it's wrong, you're not going to interfere.
[0:41:07.28] So, actually, back to Dennis's point:
[0:41:10.04] doesn't this value actually solve a lot of this?
[0:41:12.8] Because if I were dying of heart failure,
[0:41:16.12] I wouldn't have the expectation that the healthy person would give me their heart. I mean, nobody...
[0:41:20.56] You wouldn't have the expectation, but wouldn't you want?
[0:41:24.44] No, no, no.
[0:41:26.92] I wouldn't want it either.
[0:41:28.28] It seems almost...
[0:41:29.16] I know many others who would.
[0:41:31.4] I doubt many people would.
[0:41:35.48] Yeah, OK.
[0:41:36.24] But I think those are extremes, of course, of course.
[0:41:41.2] But that exists.
[0:41:42.2] OK, but I think the rational, normal person who is dying of heart failure wouldn't expect a healthy person to give them their heart, right?
[0:41:54.92] I think that's
[0:41:56.84] a non-controversial statement.
[0:41:59.12] What if we take it a step further and discuss, without getting into the details again, COVID and the ER: whom would you attend to
[0:42:14.12] first? Old people?
[0:42:16.4] Young
[0:42:16.68] people?
[0:42:20.72] Again, is it based on how we feel about these things?
[0:42:25.88] No. Yeah, yeah.
[0:42:26.24] I mean, you have two people, one old person and one young person, at the ER.
[0:42:30.64] Look, I think even there the silver rule would almost... maybe not the silver rule.
[0:42:39.16] But let me tell you something, and this
[0:42:41.68] I say in private because I feel bad saying it publicly even.
[0:42:45.72] Like, I think we all feel sadder when a young person dies compared to an older person dying, for example, right?
[0:42:53.68] I mean, it feels more of a tragedy when it's a 10-year-old versus a 90-year-old; we probably all feel somewhat the same.
[0:43:02.28] So it seems like that sentiment again, to use the same term that you used: the fact that I feel sadder that it's a 10-year-old versus...
[0:43:09.88] Yeah.
[0:43:10.76] Because of the potential of more. So what you're looking at here are the consequences of your action, and not the action itself.
[0:43:17.72] Whereas if you're looking at the old person and saying, I need to save them,
[0:43:21.36] in your mind
[0:43:22.92] what you're saying is: this person has human dignity and it's my duty to save them.
[0:43:28.32] So you care more about the action than the consequences.
[0:43:31.08] If you're looking at it as: a young person deserves to live because they have their entire life ahead of them,
[0:43:39.48] you're looking at the consequences of your action, because you're basically weighing the pros and cons of letting this person die and attending to the other person.
[0:43:48.68] So you're right; this is the thing.
[0:43:50.72] It shouldn't be bad to voice these opinions, because this is what we do in philosophy classes.
[0:43:57.72] These are normal discussions, and they inform the frameworks of these ethical theories.
[0:44:05.32] Whether or not you agree.
[0:44:06.32] And this is This is why it's important.
[0:44:07.92] Because, again, if you're saying: no, I would save the young person's life,
[0:44:11.92] what you care about are the consequences.
[0:44:14.68] You're looking toward the future and you're saying there's a lot more chance that this person will be productive,
[0:44:20.68] live their life, be happy.
[0:44:22.44] You're contributing to the greater good of society.
[0:44:25.16] Whereas if someone is already old, different story.
[0:44:28.48] But if you're like Nassim in this case, and this is a true story, this is what his opinion is, you would rather say no, it's our duty to save these people.
[0:44:38.76] It's because you're looking at the action itself, and the action is what you think is right to do, and in this case that's to save them.
[0:44:45.6] Is that what you think Nassim is doing?
[0:44:47.4] Because I always felt... I agree with you.
[0:44:50.2] I always thought there was like a gap there.
[0:44:52.88] But on the other side, I think, upon reflection, I felt like he was just reacting to
[0:44:59.96] the side of the people who were saying, well, we shouldn't be doing any lockdowns or any intervention because it's only affecting old people and those people are going to die anyway.
[0:45:10.92] They were already fragile, on the brink of it, and I felt like he was balancing that, like, we shouldn't be saying that, right?
[0:45:18.2] I mean, an eighty-year-old dying still deserves some dignity, like
[0:45:24.56] the community making some small sacrifice at least.
[0:45:28.08] Well, whether it's small or big is another issue, but to prevent it...
[0:45:34.16] I mean, I remember when COVID broke out here in the United States, very close to where I lived, there was a nursing home, like 50% of the people died.
[0:45:41.2] It was quite crazy to think about, especially with the early variants, right?
[0:45:48.44] So it was brutal for the elderly.
[0:45:52.88] But what do you think Nassim was saying? Was it more philosophical, that our actions should be indistinguishable whether we're saving a 10-year-old or an 80-year-old, or more about
[0:46:04] balancing that, you know, that it's not just an old people's problem,
[0:46:10.08] so young people shouldn't be inconvenienced? Because that was my reading:
[0:46:18.32] that it was selfish for the young people to say, I just want to go to parties, and who cares?
[0:46:25.48] Yeah, but this is why
[0:46:28.24] it's curious, because Nassim doesn't really like Kant,
[0:46:32.12] and then I found his argument to be a lot like Kant's.
[0:46:36] So what does Kant do? Because I'm not talking about him now;
[0:46:38.2] we'll get to him.
[0:46:39.88] But I'm mentioning that he hates Kant.
[0:46:41.6] It's because, OK, it's curious how ethics is very complicated, because you are put in a situation whereby you will have to contradict yourself. All of a sudden, I think he was, on the one hand, bothered by how people were interpreting antifragility, because they were not taking into consideration the ethical dimension.
[0:47:02.88] And on the ethical dimension, I think Nassim is very much theory-of-action driven.
[0:47:09.64] He looks at what is right.
[0:47:10.92] He's obsessed with
[0:47:13.32] two theories.
[0:47:14] We're going to be looking at them today: virtue theory and, inadvertently, I don't think he knows it, but he's an action-driven person.
[0:47:22.12] If the action is right, he does it.
[0:47:24.4] This is why it's like: you do no harm.
[0:47:26.64] This is where GMOs come in, because the other argument is: but they're saving lives, GMOs are helpful, etcetera.
[0:47:33.68] So with GMOs you look at the consequences of your action, and Nassim is like: no, do no harm, avoid harm at all costs, don't kill people.
[0:47:42.12] So he's very rigid, and by rigid I mean it's the theory of action.
[0:47:48.24] He looks at the action itself.
[0:47:49.84] He doesn't care about the consequences.
[0:47:52.12] He does look at the consequences, of course; there is a dimension there.
[0:47:55.72] But his main framework looks at the action before the consequences.
[0:48:01.24] At least this is what I think.
[0:48:04] Which, I think, is an interesting dimension that applies to business, right?
[0:48:07.68] Because in business you almost can't ignore the consequences if you want to remain profitable. Precisely.
[0:48:13.76] And this is why I was telling you, you need to look at the act and you need to look at the consequences as well, right?
[0:48:19.2] So yeah, this is the problem, and the problem for people like Nassim
[0:48:26.32] is that it's very difficult for him to even... This is why he hates all the people he hates and doesn't want to deal with them.
[0:48:35.32] It's because he's very righteous in his approach.
[0:48:38.68] But it's not only Nassim; there are other people who are also righteous, and it affects them and others as well.
[0:48:43.48] You know, cancel culture, etcetera.
[0:48:45.08] They use a different framework, as I said.
[0:48:46.56] But it's the same thing, and it's tricky.
[0:48:50.08] And why is it tricky?
[0:48:51.56] It's because, like we just saw, there are two arguments, right?
[0:48:55.76] You would save five people and kill one because it's better.
[0:48:59.56] And then the other argument is: you would let five people die, but you wouldn't kill; you would save the one person. Two arguments.
[0:49:11.44] Yeah, I'll get to you.
[0:49:12.96] Hassan, just let me finish this idea.
[0:49:16.16] So these two arguments, the conclusions are different.
[0:49:22.44] But then usually people fight over the conclusion.
[0:49:24.92] It's like they start calling each other names.
[0:49:26.76] No, you're a killer.
[0:49:27.76] No, you don't care about other people, you don't care about
[0:49:30.52] the youth.
[0:49:31.16] You're selfish.
[0:49:32.16] But it's not that;
[0:49:33.12] you're neither this nor that.
[0:49:34.4] The conclusions are correct; both conclusions make sense.
[0:49:39.76] What is different is the framework you're starting from.
[0:49:43.8] And so if you get into a debate, an ethical debate, what you need to look at is not your decision.
[0:49:51.32] What you need to look at is the framework that you're using.
[0:49:55.04] Because more often than not, people jump between frameworks and they don't even know which frameworks they're using.
[0:50:02.56] They don't even care about these things.
[0:50:05.56] And these frameworks are pretty established. When we're talking about the right to healthcare, there's an underlying framework.
[0:50:13.32] When we're talking about free market, there's an underlying framework.
[0:50:16.92] When we're talking about consequences, there's a framework, etcetera.
[0:50:20.92] So the problem more often than not is not the argument itself.
[0:50:24.84] It's the starting point, the framework that we're using and the assumptions that we have.
[0:50:30.12] And so if you get into a heated debate with others, I would urge you not to look at the conclusion; forget about the conclusion, try to see where it is coming from.
[0:50:40.76] And that's why I was asking you at first, why do you think this or that is ethical or it's wrong to do this or it's wrong to do that.
[0:50:48.12] We're going to look at them in more detail now, also very briefly, with clear examples to clarify this issue.
[0:50:57.28] But this is, I think, the main takeaway of today's session.
[0:51:00.28] Like, this is what is always interesting when it comes to ethics.
[0:51:03.24] The debates we have, the heated debates, the blocking each other,
[0:51:08] and the shit-talking, happen because you start judging people based on their conclusion.
[0:51:16.04] But more often than not, their framework makes sense.
[0:51:20.4] So if you want to change someone's mind, let them look at it from a different perspective, a different framework.
[0:51:26.64] Tell them, well, maybe let's let's try to examine this from this perspective.
[0:51:30.76] Let's see what you think.
[0:51:31.76] Etcetera, etcetera.
[0:51:33.16] Yes, Hassan.
[0:51:35.24] Yeah, that actually segues into the point I was trying to make earlier, which is: even the consequences...
[0:51:40.72] So I think Tim Ferriss mentioned this at one point where he's kind of talking about do you donate to save the whales or do you donate to Save the Children in a developing country?
[0:51:51.32] And, you know, if you ask any average person, they'd be like, yeah, human life is more important than a whale's life.
[0:52:00.4] But then he kind of puts forward the argument that,
[0:52:03.08] you know, that whale you save might end up curing cancer as a second- or third-order effect, and that kid you saved in a developing country might end up holding an AK-47 and shooting your cousin in a few years, right?
[0:52:14.64] So
[0:52:15.24] even the consequences... even when thinking within a framework, the conclusion is sometimes debatable too.
[0:52:22.6] And I just thought that was interesting.
[0:52:25.16] Definitely.
[0:52:25.88] Definitely, this is why it's complicated.
[0:52:29.64] And this is why I think people try to avoid ethics: because it's not easy, and not because the subject is difficult to understand.
[0:52:38.8] As you see, there's no rocket science there.
[0:52:42.28] It's very easy, straightforward.
[0:52:44.56] The frameworks are very clear.
[0:52:46.96] As you see, the examples are extremely easy to relate to and understand.
[0:52:52.44] It's just that thinking about the consequences and evaluating them gets you to a point where, if you really start thinking about it from different perspectives, your opinions and belief system will change.
[0:53:08.68] Sam Harris has talked about the idea of distance in these questions.
[0:53:11.36] Yeah, it's easier to save one child who's 10 yards away than a thousand children on the other side of the world. And also, this is important to keep in mind: scale, and the effect of what you're seeing. Like, if you're not going through war, etcetera,
[0:53:33.92] you don't really care about the ethical issues. Like in Daniel's case, you saw this senior residence; it's right next to you, you've seen it. If you're in a place where there are no old people, or it's relatively young, people wouldn't care.
[0:53:54.48] They're like, yeah, they're lying,
[0:53:56.44] no one died,
[0:53:57.56] everyone is healthy, etcetera, etcetera. So scale is important.
[0:54:01.76] It's not the same if you're a solopreneur as if you run a company or you're a VP in a multinational company.
[0:54:09.6] And then you have all sorts of issues, from things that are not that important, like someone stealing office supplies, to something as extremely dangerous as, I don't know, harassment or discrimination. Dangerous between brackets:
[0:54:27.52] it's something extremely serious.
[0:54:31.88] So this is why it's really difficult.
[0:54:36.16] And the contexts are different.
[0:54:38.04] If you're a medical doctor, your ethics is way different than if you're in business, way different than if you're an engineer.
[0:54:46.16] But the principles are the same, right?
[0:54:48.96] So when we're talking about a medical doctor not doing harm versus an engineer not doing harm, the principles are the same but the application is different.
[0:54:57.88] And these principles come from these domains of ethical analysis.
[0:55:04.08] The first framework, as I said, looks at the consequences.
[0:55:08.76] One example is utilitarianism.
[0:55:10.76] We've already done something similar, but this is from the movie Crimson Tide, and you have Denzel Washington here.
[0:55:20.6] He needs to make a decision in a split second.
[0:55:22.88] The bilge bay... so, the submarine got hit by a torpedo.
[0:55:30.4] There's some damage in the bilge bay, and there are, as you see, three people trapped.
[0:55:37.76] Another one is already dead, and the submarine is sinking.
[0:55:44.08] It's going to implode at any point unless they seal the bilge bay, which is the compartment that handles water.
[0:55:54.12] What would you do in this case? This is a very realistic example.
[0:56:00.88] It might not happen to us, we might not be there, but you can relate to it in a sense, right?
[0:56:06.36] You have to make a decision.
[0:56:07.56] If you don't seal it shut, all of you are going to die.
[0:56:12.68] If you order them to seal it shut, you know that two or three people will be killed.
[0:56:22.4] I think there are protocols for these things. I mean, in wartime and whatever, they happen so often that I think the protocol is to save as many lives as possible, which in this case would be, you know, to seal it shut, right?
[0:56:36.04] Yeah.
[0:56:36.24] But I think even in the movie Titanic, right, we saw the same thing, if you remember correctly.
[0:56:42.16] They shut the chambers and there's, like,
[0:56:45.28] the engine people getting submerged there.
[0:56:48.88] So it's, I think, yeah, but then the...
[0:56:52.4] Protocol.
[0:56:53.2] To get to this protocol, you're basically examining the consequences, right?
[0:56:57.88] In this case you're saying, would I sacrifice two people? And this is not a trolley problem here.
[0:57:04.8] This is something that happens frequently, to a certain extent.
[0:57:10.28] Would you?
[0:57:11.76] And another example that is maybe more relatable, would you fire five people to save a company or would you?
[0:57:22.96] Yeah, no, it's all the time, right.
[0:57:24.24] I mean, it's not even that.
[0:57:26.56] I mean, that happens every day.
[0:57:31.4] And again, it's usually protocol.
[0:57:32.88] I mean, it might not be official protocol, but it's managerial protocol.
[0:57:40.6] Standard practice, but many people get angry, right?
[0:57:43.88] Like, yeah, yeah.
[0:57:44.44] So even if it's a standard practice, it's because we're looking at the consequences, right?
[0:57:49.12] Because what you're looking at here is: I want to save the company.
[0:57:52.84] And so instead of asking everyone to take a salary cut, we're going to let go
[0:57:58.76] a few people, just like Elon Musk maybe firing employees because the company was losing money.
[0:58:06.56] Maybe still losing.
[0:58:07.36] I don't know what happened at Twitter. So the argument that other people give, in order to pose the counterargument or the different viewpoint, is that they would be looking at the act of firing itself.
[0:58:24.76] In this case, you're basically letting people go, or, in the other case, you're killing people.
[0:58:31.96] So I think this is interesting, because when it comes to business, for example, it's usually all about the consequences. I mean, the fiduciary duty of making the business profitable, or if the business is your own, it's still a business. We call it "it's business, nothing personal."
[0:58:51.84] In wartime it's similar, right.
[0:58:53.36] I mean, in wartime, as a soldier, you know you're dispensable, right?
[0:58:58.44] It's acceptable, like, the consequence is
[0:59:01.6] winning the war, even though it's more brutal.
[0:59:03.92] But we sort of accept that. But see, it becomes a trickier moral question when it comes to bombing,
[0:59:11.88] let's say, a building where you assume that there are terrorists, but then there are innocent people, and that's something people have debated over the past few weeks.
[0:59:23.68] Yeah.
[0:59:23.88] It becomes wrong for some reason.
[0:59:25.52] Like you know it's.
[0:59:26.44] This is, this is the difference it it's.
[0:59:29.8] Because it also violates a protocol of war behavior, right?
[0:59:40.04] Or why is it OK to kill some people in certain circumstances but not in others?
[0:59:46.04] Or why is it OK to fire people in certain circumstances and not OK in others?
[0:59:51.76] So the protocols are informed by this?
[0:59:55.24] But what I'm I'm curious as to what you guys think.
[0:59:58.04] Like, do you think that
[1:00:01.12] the ethicality of your decision should be informed by the pros and cons, and whether it contributes to the greater good of the society or the company?
[1:00:16.56] Well, with business,
[1:00:17.8] I don't see it as a greater-good problem, right.
[1:00:23.6] It's about the business.
[1:00:29.8] Well, the greater good of the business.
[1:00:32.48] Me.
[1:00:32.84] Yeah.
[1:00:33.48] You want to sustain the company because, well, the argument would be that at least 5,000 people will keep their jobs even though we're sacrificing 1,000, and we are contributing to society, we're creating jobs, and these people are contributing.
[1:00:49.6] You're bad for the economy?
[1:00:50.72] Then yes, what?
[1:00:51.24] Doesn't look like you.
[1:00:53.16] That's usually the argument.
[1:00:54.16] They have the economy etcetera.
[1:00:56] So you look at the pros, and then the cons are just, you know, whatever.
[1:01:00.4] So it's, it's ethical.
[1:01:03.56] And this is how businesses are run, as as Danielle is saying, and this is usually the utilitarian perspective.
[1:01:10.48] Now we can delve way deeper into the utilitarian perspective, but then the idea here is you don't care about the action, you don't care whether you're firing.
[1:01:19.24] You're lying to people.
[1:01:20.52] You are being dishonest.
[1:01:21.64] You're being, you're not being transparent.
[1:01:24.88] You're not.
[1:01:25.6] You don't have integrity.
[1:01:27.72] This is marketing as well and all these practices right?
[1:01:31.56] You don't care whether you're misattributing or whatever it is.
[1:01:36.2] All you care about is the consequences of the action.
[1:01:38.88] If the consequences of your action bring good, if the pros outweigh the cons, then it's OK.
[1:01:52.76] So in this case, in the case of Daniel for example, you can just say I'm offering a discount.
[1:01:58.64] Some people are accusing me that it's unethical.
[1:02:01.2] Well, here's my argument.
[1:02:04.76] 200 people joined and benefited from my discount.
[1:02:10.64] I have 4000 members in the community, no one complained.
[1:02:15.6] Everyone is happy.
[1:02:17.92] The pros outweigh the cons.
[1:02:19.68] Therefore, I'm not doing anything unethical. And this is how the framework works.
[1:02:25.24] So if they accuse you of doing something unethical, look at the action.
[1:02:30.96] What are you trying to accuse me of?
[1:02:32.44] What is the action that you think I'm wrong about?
[1:02:35.88] Are you accusing me of being dishonest?
[1:02:37.84] Of manipulation?
[1:02:39.56] Fine.
[1:02:40.04] You think it's manipulation?
[1:02:41.2] Well, my framework is that I'm bringing goodness and happiness to the greatest number of people I can. Look:
[1:02:50] Everyone is happy.
[1:02:51] Maybe one or two people are complaining.
[1:02:53.36] Therefore it's good.
[1:02:54.88] I'm not doing anything unethical.
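To make that weighing concrete, here is a minimal sketch in Python, not from the talk or the book: score each affected group, sum the utilities, and judge the act by whether the net result is positive. The counts just restate the discount example above (200 people benefited, a couple of complaints in a 4,000-member community); the per-person weights are arbitrary assumptions for illustration only.

# A toy consequentialist "calculus": sum (people affected) * (utility per person).
def net_utility(effects):
    """Return the total utility over a list of (count, utility_per_person) pairs."""
    return sum(count * utility for count, utility in effects)

discount_campaign = [
    (200, +1.0),    # members who joined and benefited from the discount
    (2,   -1.0),    # people who complained that it felt manipulative
    (3798, 0.0),    # the rest of the community, roughly indifferent
]

score = net_utility(discount_campaign)
print(f"net utility: {score:+.1f}")
print("verdict under this purely consequentialist lens:",
      "acceptable" if score > 0 else "not acceptable")

Running it prints a positive net utility, which is exactly the "pros outweigh the cons, therefore it's fine" move described above; the frameworks that follow push back on precisely this step.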
[1:02:56.88] It's interesting, because it's actually very similar to the arguments I used. Nevertheless, I think what you said before complicated it: like, if you have a business, is it OK to lie to employees or whatever? And I think that makes it a bit
[1:03:21.2] more difficult to interpret, because is it OK? Like, I don't know.
[1:03:26.08] I don't think so.
[1:03:27.36] Like, does the utilitarian framework say that you should do whatever you need to do to maximize the utility?
[1:03:44.76] I mean, there's constraints.
[1:03:45.8] I mean, there are some categories within utilitarianism, but chances are yes.
[1:03:52.16] Like, it's a bit more complicated than that, right?
[1:03:59.32] It's not just "do whatever you want,"
[1:04:02.68] but usually the end justifies the means.
[1:04:04.4] Yeah.
[1:04:04.64] Yeah.
[1:04:04.76] Yeah.
[1:04:04.96] Yeah.
[1:04:05.44] And that's what's implied by it.
[1:04:07.72] And then.
[1:04:08.24] Yeah, yeah, yeah, yeah.
[1:04:11.96] This is where,
[1:04:13.52] this is usually what economists and businesses,
[1:04:17.56] this is the argument they put forward when it comes to discussing the legality of certain issues. But this is the thing: you feel uneasy when it comes to things like lying.
[1:04:31.36] And what's happening here is your utilitarian framework is conflicting with this other framework, which is the theory of action or theory of duty framework.
[1:04:45.6] Within consequentialism, within the theory of duty, and within the other framework we're going to be discussing, there are several subcategories and subframeworks.
[1:04:58.12] I'm only using one.
[1:05:00.08] This is just for you to have a sense of how they they evaluate the actions.
[1:05:05.16] When it comes to the theory of duty, you don't care about the consequences, you care about the action.
[1:05:09.96] So in this case, in the case of Danielle, you don't like lying, but on the other hand you're examining the consequences and you think that the consequences are good, that they're contributing to the greater good of your business and people, etcetera.
[1:05:29.32] In this case, you have two conflicting theories.
[1:05:31.16] What would you do?
[1:05:34.88] And the example I gave in the book, which is, which is always fascinating.
[1:05:40.12] We can also apply it to the to the business and lying and not lying.
[1:05:44.36] In Night Train to Lisbon, a very interesting novel written by Peter Bieri under the pseudonym Pascal Mercier.
[1:05:50.72] Cheers, Hassan, thanks for joining.
[1:05:54.8] There's this medical doctor living in Portugal during the time of the Salazar dictatorship.
[1:06:03.68] And one day, the right hand of Salazar,
[1:06:07.24] this is fictional, of course; Salazar existed, but this Butcher of Lisbon, the right hand of Salazar, Jorge Mendez he's called, is a fictional character, is injured, and it happens right outside his clinic.
[1:06:25.2] So this medical doctor wakes up one day, and in front of him lies the body of the guy called the Butcher of Lisbon, the head of the Secret Service.
[1:06:37.64] If he doesn't do absolutely anything, the guy will die.
[1:06:43.36] But as a medical doctor, he needs to keep his Hippocratic Oath, do no harm.
[1:06:50.72] Unflinchingly, this guy decides to give him a shot and saves his life.
[1:06:58.36] Of course, all the people in the neighborhood are bothered by what he did.
[1:07:03.6] The news spread.
[1:07:04.84] This guy becomes blacklisted among his people and the Resistance, and so, in order to make up for it, he joins the Resistance.
[1:07:13.08] He repents for his sins. But there's the political person in him and there's the medical doctor in him, and the medical doctor in him said: over my dead body am I going to let someone die, even if they're, you know, butchers.
[1:07:33.8] I actually asked my father, who was a medical doctor; he's retired now.
[1:07:40.48] He was in the Lebanese army and of course he's seen so many shitty things.
[1:07:48.24] And I asked him, because they did have several encounters with terrorists in Lebanon, in addition to the war.
[1:07:55.28] And I asked him, what would you do?
[1:07:56.92] Like, there's this guy, what's the protocol?
[1:08:01.24] I asked about the protocol and the protocol is we save their lives.
[1:08:07.28] Here's a person who's killing people and is a terrorist, and you're saving their life and sending them to the police.
[1:08:19.6] I know this is extreme, but this is it illustrates the theory of duty.
[1:08:25.24] It's like it doesn't matter what the consequence is.
[1:08:27.72] It doesn't matter who the person is.
[1:08:29.4] It doesn't matter what you're doing.
[1:08:31.16] So if your duty is not to lie, you don't lie.
[1:08:35.08] If your duty is to save someone's life, even if it's the worst person on earth, you save their life.
[1:08:43.88] What do you guys think?
[1:08:45.48] So it you look at the action, the action is inherently right or wrong, and there are reasons for why it's inherently right or wrong.
[1:08:54.12] And then your duty is to just do the action.
[1:08:57.04] So just like Daniel said: I don't like to lie.
[1:09:01.24] I have a duty not to lie.
[1:09:02.64] This is where religions come in, right?
[1:09:04.2] The religions say thou shalt not.
[1:09:07.16] You shouldn't.
[1:09:10.64] What do you guys think?
[1:09:13.28] No, I like this duty perspective.
[1:09:15.08] I think it's easier to reason about than the action one.
[1:09:17.96] Like with the trolley problem.
[1:09:19.24] I think seeing just the action, like the Hippocratic oath, makes sense.
[1:09:28.96] Like the doctor, it's your duty to save the person's life.
[1:09:34.2] It's an.
[1:09:34.56] It's.
[1:09:37.8] I like it:
[1:09:39.88] before them being terrorists, you're looking at them as humans.
[1:09:43.28] Well, I think, like what you said, seeing them as humans, it seems like, as you said before, these frameworks can have inconsistencies between them, and I don't know if there's a solution for how to deal with the inconsistencies.
[1:09:57] It's a little bit like, again, I like the example you used with the business case.
[1:10:03.96] Like, the utilitarian framework says, for your business, you do whatever it takes to make your business succeed.
[1:10:11.48] But then there are probably some other duty, honor, integrity frameworks, whatever you call them.
[1:10:19.68] Like you don't lie, you don't sort of mislead,
[1:10:27.44] and so on and so forth, which can be inconsistent.
[1:10:32.52] And actually, like with this framework, you start to recognize some interesting business characters.
[1:10:37.84] Like, for example, to take an extreme case, Andrew Tate, Cobra Tate, whatever.
[1:10:43.48] That is an interesting fellow.
[1:10:44.56] I disagree with him on many, many things, but he actually has some interesting nuggets of an extremely utilitarian perspective, like maximizing from a business side.
[1:10:57.2] But it seems like he's completely lacking any of the other things like.
[1:11:01.92] It's funny, I mean, recently I joined one of his products just out of curiosity, right? And all the dark patterns you can imagine, with regards to getting you in and not letting you unsubscribe or cancel your subscription, he implements them all, which is almost funny, how extreme it is. But it makes you think: what is he doing? And now that I've attended the session, the explanation seems to be
[1:11:32.48] a very utilitarian framework, which seems to be working well for him in terms of business practicality.
[1:11:41.2] But there are almost no constraints; it's all "the end justifies the means," right?
[1:11:49.36] It's just that, right?
[1:11:52.36] Whether it's lying or misleading or exaggerating, hyperbole, right?
[1:11:59.52] I would just highlight that maybe it's not utilitarian, maybe it's consequentialist in this case.
[1:12:05.72] Maybe, yes, the broader one. And why
[1:12:08.28] am I saying this? Because within consequentialism you have, for example, egoism, which is: you only do what is in your interest.
[1:12:17.08] So maybe this also applies for him.
[1:12:19.72] He doesn't care about other business.
[1:12:22.08] It could be.
[1:12:23.2] It's it's like win, win, win, win, win.
[1:12:25.56] Yeah, yeah.
[1:12:27.04] And again, it could be for the business, I know his brothers are involved and maybe there are other people, so it could be just the entity. But it seems like it's an interesting framework.
[1:12:39.8] Yeah.
[1:12:40] Like, there are these inconsistencies, again like with the doctor, the political part of the doctor versus the professional duty of the doctor. And by the way, the silver rule is deontological in a sense.
[1:12:59.64] So it's a theory of duty, because you have a duty to not do unto others what you don't want them to do unto you.
[1:13:08.12] So it becomes your duty.
[1:13:11.48] Yeah as well.
[1:13:12.4] So here,
[1:13:14.2] it's interesting to me, you know: your evaluation is not based on the consequence.
[1:13:18.28] Your evaluation is based on what action I do not want others
[1:13:23.6] to do to me.
[1:13:28.88] I think something that can help with these conflicts between the two sides is the Viktor Frankl approach: would I feel proud about this if somebody were to write my biography and this is how I acted?
[1:13:45.92] And it said that I lied to my customers or misled my investors or whatever, just to maximize my business.
[1:13:53.08] Would I feel proud if my kids were to read that?
[1:13:56] And I think that's a very good way of... You will be dead.
[1:14:00.44] Well, what do you mean, you will be dead? Like from a business side? No, you will be dead by that time.
[1:14:10.28] No, no, it's not that I don't even care about it.
[1:14:12.88] I'm not playing devil's advocate here.
[1:14:14.64] But I think that's why,
[1:14:15.84] that's why using your children as a reference helps.
[1:14:19.92] Because your children, they outlast you.
[1:14:23.08] An interesting segue into the theory of virtue.
[1:14:26.12] Because what you're evaluating here is the character.
[1:14:29.64] Because when you're saying you know.
[1:14:31.08] Yeah, yeah, yeah.
[1:14:31.76] How would?
[1:14:32.52] So here,
[1:14:33.16] you're not looking at the action, you're not looking at the consequence.
[1:14:36.2] You're
[1:14:36.32] looking at the character, which I think boils down to Viktor Frankl's philosophy: I want to have a character, in everything that I do, my whole philosophy.
[1:14:44.4] Like, even if I die in the concentration camps, which was his story, I want my character to be one that I'm proud of, which I think, again, really helps with lots of these things, right?
[1:14:55.6] I mean how you, how you react to people, how you, how you behave, what you do and so on and so forth.
[1:15:03.84] Precisely.
[1:15:04.48] And this is why I think this is the best illustration of it.
[1:15:09.12] And mind you, again, these are very brief summaries, just to highlight these frameworks. Shameless plug:
[1:15:17.64] If you're interested in reading more about them, check the book out.
[1:15:21.12] I examine them in depth, with examples from businesses, etcetera.
[1:15:26.4] But in this case, Groundhog Day.
[1:15:29.08] These examples are the ones I shared at the beginning of each section.
[1:15:36.16] And then I delve deeper.
[1:15:38.8] Groundhog Day, I don't know if you've seen it: Phil Connors is a narcissistic person, etcetera.
[1:15:45.32] He's, like, annoying.
[1:15:46.52] He's the epitome of a shitty person and a shitty colleague.
[1:15:52.68] He scorns people.
[1:15:57.4] He's condescending.
[1:15:58.56] He calls himself the talent.
[1:16:00.8] It's like it's typical Twitter behavior.
[1:16:03.08] Typical.
[1:16:04.32] I'm the best.
[1:16:05.08] You guys don't know anything.
[1:16:06.64] I'm I'm the best person, best character.
[1:16:09.44] I I exercise.
[1:16:11.24] I know better than everyone else etcetera, etcetera.
[1:16:14.32] But then one day, long story short: he's a weatherman.
[1:16:19.68] He's covering the Groundhog Day festivities.
[1:16:22.32] Gets stuck in a loop and after several attempts he realizes there's no way out.
[1:16:28.68] And so, bit by bit, he starts examining himself and his character, right?
[1:16:34.72] And things start changing.
[1:16:36.16] When he starts working on his skills.
[1:16:39.44] He starts developing better relationships with his cameraman and his producer.
[1:16:44.96] He starts helping people.
[1:16:46.76] So, long story short, he becomes a better person.
[1:16:51.8] But not overnight.
[1:16:52.64] He becomes a better person after iterating.
[1:16:55.4] And then he develops these into habits.
[1:16:59.16] So he cultivates these habits of good character traits and eventually he snaps out of the loop.
[1:17:09.28] But then he's a better person now.
[1:17:12] And so the entire moral of the story is, well, self-examination: to try to look at our habits and to improve ourselves.
[1:17:24.6] And this is what virtue theory is about.
[1:17:28.24] Virtue theory was proposed by Aristotle.
[1:17:31.32] So ask yourself these questions.
[1:17:33] You're not looking at your actions, you're not looking at the consequences.
[1:17:36.96] You're looking at what kind of character you want to be.
[1:17:39.4] And this is what Daniel was saying: when it comes to how your kids view you, how you treat others, etcetera, what kind of person do you want to be?
[1:17:47.76] What kind of solopreneur do you want to be?
[1:17:49.52] Do you want to be the sleazy solopreneur who's always selling the high-ticket "I'm going to help you" stuff, etcetera, etcetera, and whatever?
[1:18:02.2] What character traits and habits do you want to cultivate?
[1:18:05.72] Would you like to be an honest person etcetera.
[1:18:07.76] So you look at character traits we would like to cultivate.
[1:18:13.8] It's a bit more elaborate than that, but the assumption is that by cultivating good character, we are not giving people fish, we are teaching them how to fish.
[1:18:28.52] So you're not telling them how to act, you're not telling them what they should do.
[1:18:33.24] You're basically equipping them with the right habits so that they can figure out on their own what the right thing would be to do.
[1:18:42.16] So by cultivating good character traits, like being honest, temperate, wise, courageous, just, like having integrity, all these character traits that we uphold and recognize as good, we would have a decently functioning society.
[1:19:10.2] Of course, scale matters.
[1:19:13.24] Like, it's not the same if you're living in a community of 150,000 as if you're in a country, right?
[1:19:21.84] Like, with 5 million people. And they're also inexhaustible, you can always be more...
[1:19:27.64] Exactly.
[1:19:28.44] It's that you can never be generous enough.
[1:19:32.72] It's an ongoing practice.
[1:19:34] You're developing the habit you're you're trying to.
[1:19:37.24] Thanks, Shane.
[1:19:37.96] Cheers.
[1:19:39.48] So this is, this is the underlying kind of framework of virtue ethics.
[1:19:44.52] I can talk a lot more about it.
[1:19:46.44] But again, like, what
[1:19:48.08] character traits do we want?
[1:19:49.96] That doesn't matter here; we can talk about them on a different occasion.
[1:19:57.84] But the idea here is, do I want to be fit?
[1:20:00.44] Do I want to be healthy?
[1:20:01.48] Do I want to be to treat people with respect?
[1:20:03.56] Do I want to be honest?
[1:20:04.76] Do I want to keep my word, Do I want to blah blah blah, right.
[1:20:08.32] So in this case, just like Daniel was saying,
[1:20:11.92] I don't care about how I treat my customers as such, or about the consequences, or about the action, etcetera.
[1:20:18.76] I want to think about my character, how people will judge my character.
[1:20:24.56] And this is why I said Nassim has a combination of theory of duty and virtue theory.
[1:20:34] He obsesses over character and over actions.
[1:20:37.64] And this is where problems become even more apparent because you're rigid and you're obsessive over character.
[1:20:47.44] So it's like: I don't want to lie, and if you lie once then you're a bad person, you know, and this becomes problematic.
[1:20:55.68] But you don't become courageous by doing one courageous deed, and you don't become generous by just donating once, right?
[1:21:04.12] And you don't become a bad person by lying to your customers once.
[1:21:09.56] It has to be just like a medical doctor.
[1:21:12.24] You're not a bad medical doctor if you mess up once; you're a bad medical doctor if you have a track record of killing people.
[1:21:22.24] So yeah, this is, this is basically it.
[1:21:24.32] Consequences, action and character.
[1:21:27.52] These are just examples: a firm having to pay $9.7 million over gifts, because they were taking traders out, and, you know, the SEC is there because it's unethical.
[1:21:39.84] They have all sorts of fines because of unethical behaviors: overstatement of reserves, earnings manipulation, coding for government reimbursements, Medicare fraud.
[1:21:55.2] And this is what Sasha was asking.
[1:21:58.96] So this Kevorkian medical doctor, it's a very famous story.
[1:22:03.76] I think the guy is Armenian, probably also Lebanese.
[1:22:08.4] He was helping people die.
[1:22:12.12] They they wanted.
[1:22:13.72] So, people who were terminally ill. It's illegal in the US; it was, and I think it still is.
[1:22:22.28] And this guy was helping them die, like they wanted to just end their life and it's illegal and he was doing it.
[1:22:30.04] Some people look at him as an ethical person because he's just fulfilling his patients' wishes.
[1:22:40.08] If you disagree with that, that's a different story, right?
[1:22:42.28] But it's one example where this guy was serving the community: people who were terminally ill but who were in their right state of mind.
[1:22:51.32] They were sharp.
[1:22:52.44] So it's not that he was killing people.
[1:22:54.24] No.
[1:22:54.68] And everything was documented, kind of under the table, but documented.
[1:23:01.36] So I'll leave that up to you.
[1:23:04.88] It's just one interesting example.
[1:23:09.16] And then: a Stanford professor resigns over the integrity of his research.
[1:23:15.96] A Harvard professor who studies honesty falsifies their data, go figure.
[1:23:21.88] But they're not going to go to jail for that.
[1:23:26.76] They just resign.
[1:23:27.68] Or they consider resigning.
[1:23:29.88] So these things are not illegal or they're not criminal, but they're unethical, based on character, based on action, based on consequence.
[1:23:44.08] I'll I'll also leave that up to you, just by way of summing things up.
[1:23:49.32] Responsibility.
[1:23:50.16] We talk about fairness, we talk about honesty.
[1:23:52.16] We talk about value as well.
[1:23:53.64] Someone mentioned value, conscience, choice, across domains: corporate, workplace, sustainability, biodiversity, tech, because we mentioned LLMs and AI, cybersecurity.
[1:24:07.52] Now with self-driving cars: would you kill the driver, would you kill the person, who would you kill?
[1:24:14.04] Etcetera.
[1:24:14.44] So all these, you know, different kinds of domains where ethics is rampant. And these are the steps to help us identify ethical dilemmas and make decisions: recognize what the dilemma is. Why is it a dilemma for you?
[1:24:39.48] Is it because you're thinking, just like Daniel was saying: I don't want to lie, but then I want to keep the business, but then my character?
[1:24:47.36] Are there other frameworks that maybe you are implementing and you want to figure out?
[1:24:52.72] Gather relevant information. Because many people say, you know, when it comes to legalizing or criminalizing marijuana, that marijuana is harmful. OK.
[1:25:03.6] But do you have the information?
[1:25:06.72] Where?
[1:25:07.28] Where are you getting your information from?
[1:25:09.32] And chances are you will find data for and against the same thing.
[1:25:15.88] And it's like everyone is claiming that it's good and everyone is claiming that it's bad.
[1:25:19.68] Just by way of example. Then think about which decision framework or ethical framework you want to be implementing.
[1:25:29.48] Make the decision, take action, but then reflect on the decision.
[1:25:33.36] So in this case, Danielle or anyone else, you have an ethical dilemma.
[1:25:40.6] It's OK to make a decision.
[1:25:42.24] And insofar as the decision is not to kill someone, and insofar as you're not harming anyone to that extreme, it's always up for rethinking.
[1:25:55.08] And in this case, coming back to the trolley problem, I don't know if with these frameworks now you can think about it in a different way.
[1:26:04.6] Have things become clearer? Because all the arguments, legal and otherwise, social, religious, at the bottom of it, at their foundation, they are all informed by frameworks that are ethical: consequences, action, character, justice theory, rights theory, the right to hold guns, you know, the right to healthcare, the right to whatever.
[1:26:39.76] I did not discuss rights because we all have them in our minds.
[1:26:43.64] Why should we have the right to freedom of speech?
[1:26:48.36] Why should we have the right to etcetera? You know, so, all these frameworks, instead of just saying, well, it doesn't feel right that this is the case.
[1:26:57.84] Now I hope that you're equipped with these. Consequentialism:
[1:27:03.24] what would be the action that results in the greatest overall goodness? Action, or duty:
[1:27:08.6] what is my duty in this situation?
[1:27:10.72] Which action is right?
[1:27:12.52] And virtue ethics:
[1:27:13.32] what would align with my values as a leader or as a person?
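As a quick aid, here is a hedged Python sketch, my structuring rather than anything from the talk or the book, that turns the steps and the three guiding questions above into a reusable checklist. It is only meant to make the implicit workflow explicit, not to prescribe answers.

FRAMEWORK_QUESTIONS = {
    "consequentialism": "What action results in the greatest overall goodness?",
    "theory of duty":   "What is my duty here? Which action is inherently right?",
    "virtue ethics":    "What would align with my values and the character I want to have?",
}

DECISION_STEPS = [
    "Recognize the dilemma and why it is a dilemma for you",
    "Gather relevant information (and question its sources)",
    "Choose the ethical framework(s) you want to apply",
    "Make the decision and take action",
    "Reflect on the decision afterwards",
]

def walk_through(dilemma: str) -> None:
    """Print the checklist and the guiding questions for a given dilemma."""
    print(f"Dilemma: {dilemma}\n")
    for i, step in enumerate(DECISION_STEPS, start=1):
        print(f"{i}. {step}")
    print("\nGuiding questions:")
    for name, question in FRAMEWORK_QUESTIONS.items():
        print(f"- {name}: {question}")

walk_through("Offer an aggressive discount that some members call manipulative?")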
[1:27:18.8] If you hate spam emails, you get the gist.
[1:27:24.6] So that's just by way of wrapping it up, I hope.
[1:27:30.64] I hope that we got to a point whereby you have a clearer awareness of these frameworks and you're capable of examining your assumptions and adopting a different kind of mindset when discussing these issues.
[1:27:47.12] Not on Twitter, because it doesn't help, but you know and thank you for tuning in.
[1:27:53.2] I hope it was an informative talk.
[1:28:02.44] No, no it was.
[1:28:03.48] I liked this, the takeaway.
[1:28:06.56] They are really helpful in
[1:28:11.2] deciphering some of the dilemmas.
[1:28:13.16] So.
[1:28:14.32] So it's good.
[1:28:14.84] I was actually curious about the example of the euthanasia case that you showed us, because in that case I still couldn't actually map it to those three things. Isn't there something else there that makes it seem unethical?
[1:28:32.8] I mean, according to some people, taking away a life, like intervening to
[1:28:41.12] assist
[1:28:41.64] suicide, is unethical more for spiritual reasons, just because a human shouldn't, which I think is somewhat related to the abortion debate as well.
[1:28:50.68] I know it's a very sensitive topic, right?
[1:28:52.2] But again, it's like saying the fetus is just an unborn clump of cells versus seeing it as something more valuable.
[1:29:04.76] Does it map to those three?
[1:29:05.96] Like which one does it map to?
[1:29:07.28] Or how does it map to those three?
[1:29:09.24] The consequences?
[1:29:10.12] Virtue.
[1:29:10.6] Action it.
[1:29:14.4] It maps.
[1:29:14.88] Maybe it's the action part.
[1:29:16] It's the action part.
[1:29:16.76] Like the lack of action.
[1:29:18.24] Again, like:
[1:29:19.76] the doctor shouldn't assist suicide or assist abortion. Weighing it from a consequentialist perspective, you're basically thinking in terms of: if the person is terminally ill,
[1:29:34.64] I can give the argument that it will alleviate the pain for them and their parents, economic and otherwise.
[1:29:43.44] Of course, if everyone is on board, right.
[1:29:45.52] So this is where the context also becomes important, because if it entails high costs, etcetera, there could be an argument made from a consequentialist perspective to say they are much better off dying, because of the amount of pain.
[1:30:11.32] This is where we get what is called the calculus when it comes to utility and pain and pleasure.
[1:30:18.76] Because your ultimate goal is to try to bring the greatest happiness or pleasure to the greatest number of people.
[1:30:32.52] So in this case the pain may be excruciating, psychological, physical, etcetera.
[1:30:39.04] That alleviating someone from that will be a relief.
[1:30:43.64] This is from a consequentialist perspective.
[1:30:45.92] Another perspective: Kant would say wrong, flat-out wrong, because it goes against the principle of life, according to Kant.
[1:30:53.28] And Kant was a theory of action, duty guy.
[1:30:58.16] So it's your duty not to commit suicide.
[1:31:02.08] But this is not suicide, and, again, there are also arguments that use Kant against Kant and say it's OK.
[1:31:12.48] So it can be put there as well, because the principle of life, some people argue, is to be able to live well.
[1:31:20.72] And if that principle, what would the word be, is not being fulfilled, if that principle fails, like if you're in pain and you're terminally ill, you're not living.
[1:31:36.96] And so therefore it may be OK in that sense from an action theory, duty theory, so it can be evaluated from all these, and then the rights theory, I have the right to decide.
[1:31:52.84] This is where the abortion comes in.
[1:31:56.4] It's my body.
[1:31:57.4] I decide it's my right.
[1:32:00.24] But then we get the discussion with the baby and it's a life or not.
[1:32:03.84] But let's not get there.
[1:32:05.08] But this is, this is where the framework is coming from.
[1:32:07.6] If I am in the right state of mind and I can decide for myself, I have the autonomy and the freedom to decide for myself what I want to do with myself.
[1:32:23.12] And this can also be: you have the duty to respect my autonomy if it does not infringe on other people.
[1:32:33.32] But then we get the discussion of what about the family, what about... It's complicated, and that's why there are ethics committees; it's not easy to be on an ethics committee examining these things. In Switzerland there's an association called Dignitas where you go, they do examinations, and you decide to end your life.
[1:32:54.08] That's it.
[1:32:55.08] There's a documentary about it.
[1:32:56.28] We discussed this in Bioethics.
[1:32:58.52] There's an Italian DJ who got paralyzed by a car accident.
[1:33:03.6] His friend.
[1:33:04.88] It's illegal to do that in Italy.
[1:33:07.48] His friend drove him to Switzerland.
[1:33:10.68] He died there, came back.
[1:33:13.2] They put him in prison because he was an accomplice in a crime according to the Italian law.
[1:33:21.96] I don't know what happened afterwards, but just by way of example, so it can be.
[1:33:29.32] I can argue for it from all these perspectives.
[1:33:32.84] Long story short, very
[1:33:36.16] interesting, but it's difficult. And let me tell you, with bioethics, I don't like teaching bioethics.
[1:33:43.84] It's it's completely problematic.
[1:33:48.28] But when people have an open mind when it comes to these things, I show them the documentary where they go to Dignitas and decide to end their life. Before the documentary,
[1:34:02.28] and we're talking about a very religious country, Lebanon,
[1:34:05.48] back when I was at university, I would ask: how many of you are OK with that?
[1:34:09.8] Everyone would raise their hand against it.
[1:34:12] It's like they're not OK with mercy killing.
[1:34:15.2] After the documentary, I would say 50% start reconsidering. And I don't know, like, I don't...
[1:34:25.72] I have no opinion on it.
[1:34:26.92] It's I, I really don't know.
[1:34:28.44] It's a difficult topic.
[1:34:30.56] I I don't ever know.
[1:34:32.84] I don't.
[1:34:33.52] I have no stance on on Mercy killing.
[1:34:36.16] But it's interesting to just look at it from different perspectives and once you do, you're like, it's complicated.
[1:34:44.84] It's complicated, but yeah.
[1:34:46.88] Would you respect this person's decision?
[1:34:49.44] Autonomy, Or would you say human dignity?
[1:34:54.96] But then human dignity is to respect their decision?
[1:34:58.36] It's like otherwise you're being patronizing or paternalistic.
[1:35:03.56] It's like saying you know better about what they're going through than they do themselves. Viktor Frankl would tell you:
[1:35:09.04] Suck it up.
[1:35:10.08] Find meaning in your suffering.
[1:35:13.36] Yeah, yeah.
[1:35:15.48] Different.
[1:35:16.52] So the thing is, I don't think the problem is the conclusion.
[1:35:21.84] Again, I think the problem is judging the conclusion, as someone being bad, or jumping to character labeling, when people are using frameworks that make sense.
[1:35:37.28] So you disagree.
[1:35:38.12] But that's also why doing policy is very difficult when it comes to this, you know. Based on these arguments, should you legalize or should you not?
[1:35:50.32] Should you let people or should you not?
[1:35:53.36] It's difficult.
[1:35:54.04] It's difficult And then it goes on Twitter and people start fighting over it because of emotions and stuff.
[1:36:00.52] Hold on, take a step back.
[1:36:02.72] There's a lot more to it than just "let everybody..." No, no.
[1:36:08.08] It doesn't work that way.
[1:36:08.92] It's it's complicated.
[1:36:10.32] It's it's it's really complicated.
[1:36:13.56] Yeah, it's it's tough.
[1:36:16.44] Similarly now with AI: who gets the responsibility if someone fucks up?
[1:36:21.92] Who goes to jail if your car goes berserk?
[1:36:29.08] Yeah, we blame it on the algorithm.
[1:36:32.24] So are we saying it's nobody's fault? Accidents
[1:36:38.04] happen.
[1:36:38.64] Act of God, basically, yeah.
[1:36:42.08] Yeah.
[1:36:43.32] Absolutely.
[1:36:43.68] Maybe I have a business idea for you,
[1:36:45.56] if you think about it. Because I think the interesting part was that you really pointed to the fact that we have all these implicit ethical frameworks that we're usually not aware of.
[1:36:55.28] But when we look at these examples, suddenly we discover that,
[1:36:59.44] or we understand it better.
[1:37:00.32] Maybe you could develop an AI or something that analyzes everyone's frameworks and then puts them in context somehow,
[1:37:08.44] and then, yeah, gives a more informed frame of reference for the decision.
[1:37:14.4] That's OK.
[1:37:16.12] Maybe this could settle a lot of online debates.
[1:37:18.36] Also, if everyone suddenly becomes aware: oh fuck, these are my premises, this is where I'm coming from.
[1:37:23.24] I didn't even know that.
[1:37:24.64] This is why I wrote this book, again, and this is,
[1:37:27.68] it's not a plug.
[1:37:30.48] I bought it already, but thank you.
[1:37:32.4] Thank you very much.
[1:37:33.04] But do you really think... and that's why ethics is at the end, because the first part is about awareness.
[1:37:41.36] You mentioned this, and I'm not so sure that if you have something pointing out the frameworks that people use, they will automatically become more perceptive and receptive.
[1:37:53.64] I think many people need to first develop awareness and question their assumptions, and if they're not willing, well, that's why ethics is very difficult. And that's why I'm interested in knowing why people find it... because did I say anything controversial here?
[1:38:14.64] It's like it's it's interesting.
[1:38:16.24] I'm not mentioning my talk.
[1:38:18.16] I'm mentioning the topics: the frameworks, how they are approached, the arguments.
[1:38:23.72] I think it's very important, but then people don't.
[1:38:29.64] I'm skeptical about whether people want to know what their frameworks are or want to be aware.
[1:38:37.64] What do you guys think?
[1:38:38.48] I completely... I like
[1:38:40.48] the idea; maybe I will think about that.
[1:38:46.8] I think just this kind of becoming aware that there are frameworks, that there are different ways to approach ethics,
[1:38:53.88] I think 90% of people don't think about that, and just becoming aware of it could trigger some reflection, at least in some people.
[1:39:04.32] How would you approach it in a way such that it wouldn't become like an IQ test, but something that opens debate, whereby people wouldn't just say, oh, I'm a liberal, let's say, or I'm a conservative, you know, like there's a test you take that puts you on different coordinates, right, left, authoritarian, and so on? This is my struggle with philosophy: it requires a lot of back and forth and discussion and conversation.
[1:39:42.68] How do you think it's possible.
[1:39:44.24] Yeah, if you have an idea, do let me know, because this will save me a lot of time and trouble. That's why I don't do asynchronous courses: if I just talk about these frameworks, I feel like it's not enough; you need the back and forth.
[1:40:10.36] Yeah, so if you solve that, I really like the idea, and I've been thinking about something similar, but then I always come back to the same point: it needs a back and forth, and how can we solve that problem?
[1:40:23.72] Seems like AI has that opportunity.
[1:40:25.6] Actually, that's
[1:40:26.12] exactly, yeah,
[1:40:26.72] what I was thinking of AI also, because you have this kind of dialogue situation.
[1:40:32.4] Because nowadays, like,
[1:40:34.84] you can predict dialogue a little bit.
[1:40:37.08] Exactly, like you could feed the AI the contents of the book, basically, right?
[1:40:42.72] And let people come up with their ethical dilemmas and let the AI explain, based on the patterns, because what AI is good at is recognizing these patterns, right?
[1:40:59.28] I mean, they're based on language, but usually that maps to the other things.
[1:41:02.84] And of course it will.
[1:41:04.4] It will get things wrong occasionally, but.
[1:41:08.6] But I think there's something wrong if it depends on something shitty and...
[1:41:12.48] Well, not necessarily.
[1:41:14.56] Like, because it's all probabilistic, like it might misinterpret or whatever.
[1:41:18.92] So there's always a degree, a degree of error.
[1:41:22.92] But it's surprisingly good, as we see with even our funny bot, like,
[1:41:28.04] It's actually surprisingly sound.
[1:41:35.32] There's this guesswork because, like Sasha is saying, you can have the back and forth with the AI and refine the questions; the AI tells you something and then you say, but what about this other limitation? To do that, I'll give you the book. No, honestly, because I don't have the technical skills, other than uploading it to GPT and making GPTs.
[1:42:05.28] It's a first step. Like, literally, you could summarize it, maybe not upload the whole book, and just tell GPT, literally in the window, here's what I think, and then give it some examples.
[1:42:18.4] And then of course there's more advanced ways to do this, like.
[1:42:21.16] But it could be an interesting first step to see if there's something there.
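As a rough sketch of that "feed it to GPT as a first step" idea, and only as a sketch: the snippet below assumes the OpenAI Python SDK (openai>=1.x) with an API key in the environment; the model name and the system prompt are placeholder assumptions of mine, not anything from the book or the talk. The point is simply the back-and-forth where the model names which framework an argument leans on and asks a follow-up question.

from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY to be set in the environment

SYSTEM_PROMPT = (
    "You are a discussion partner for ethical dilemmas. For each dilemma, "
    "analyze it through three lenses: consequentialism (outcomes, pros vs. cons), "
    "theory of duty (is the action itself right or wrong), and virtue ethics "
    "(what character does this cultivate). Point out which lens the user's own "
    "argument implicitly relies on, then ask one probing follow-up question."
)

def discuss(user_turn: str, history: list[dict]) -> str:
    """Send the latest turn plus prior turns; return the model's reply."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}] + history
    messages.append({"role": "user", "content": user_turn})
    response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    reply = response.choices[0].message.content
    history.extend([{"role": "user", "content": user_turn},
                    {"role": "assistant", "content": reply}])
    return reply

history: list[dict] = []
print(discuss("Is it OK to quietly discount for new members but not tell existing ones?", history))

The system prompt could of course be replaced by a summary of the book's frameworks, which is the "upload it and do GPTs" variant mentioned above.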
[1:42:25.96] Yeah, yeah, and that's also another issue.
[1:42:29.76] There's a responsibility there.
[1:42:35.6] Yeah, like in this case, who,
[1:42:39.16] who gets blamed?
[1:42:41.12] No, I mean, this should be presented as something to make you think, just something to make you reflect.
[1:42:52.24] It's a stress tester.
[1:42:54.64] It says something just to prompt you to reflect, of course.
[1:43:00] I think, again, any reasonable person is not going to fire the employees or do something crazy based on what it says, but...
[1:43:10.8] I mean, they might think, yeah, exactly.
[1:43:12.8] This is interesting because you're assuming that they're reasonable people.
[1:43:15.76] But then, and I'm not talking about the GPT here, I'm talking about in general.
[1:43:19.8] But this could be the same thing.
[1:43:21.16] Look, it could be the same thing with the book.
[1:43:23] Like, some people might read your book and say, oh, because of what you said... This is actually what Nassim had a big issue with, like when people were using Antifragile for "oh, you should just let all the grandmas die and the population is going to become stronger."
[1:43:38.24] Which,
[1:43:38.52] which
[1:43:39.4] irked him, like, to his soul.
[1:43:41.48] Because it's.
[1:43:42.64] It's not what he meant.
[1:43:45.48] Yeah, Which is Which is.
[1:43:48.36] Probably. But again, it's an interesting idea. What's interesting with LLMs nowadays, and this is something we recognized with the small bot, because with the small bot we literally told it, I mean it's a bit more sophisticated than that, to reply
[1:44:06.12] as if Nassim Taleb was replying to an angry, like, to an annoying speaker on Twitter.
[1:44:11.68] And it gets his, almost his, moral things right, right?
[1:44:17.2] Because it impersonates him.
[1:44:18.52] Really.
[1:44:19] We tweaked him a bit over time, right?
[1:44:20.96] So it's not like, But that's how it started.
[1:44:23.44] So with just that prompt, presumably it had read all of Nassim's tweets and whatever, and it figured out,
[1:44:34.56] to a reasonable degree, right,
[1:44:36.72] What he stands for and how he reacts to things and so on and so forth.
[1:44:40.76] So I think with probably even more explicit training, or whatever it's technically called, you can make it even more interesting.
[1:44:52.88] Yeah, it's definitely interesting, because the reason I wrote this book is that I'm exploring, based on the workshops and classes I gave and all that jazz.
[1:45:12.08] Forget ethics.
[1:45:13.12] Like there there are these broad categories.
[1:45:16.28] One of them is ethics.
[1:45:17.76] But ironically, ethics becomes interesting after the fact, and that's why I don't mention it anyway.
[1:45:24.52] Like, once you hit the wall, then you start thinking, OK, maybe, maybe there's something to it.
[1:45:30.52] But right from the start, and I have that with many, many people I know who've taken my courses, they're like ethics is pointless.
[1:45:40] If it's legal, it's OK, yes.
[1:45:42.2] But then we're not only talking about what is legal and what is illegal, We're talking about other things, How you treat others, how you go about life, your values, everything.
[1:45:54.88] Because these very people are also very ethical, like very ethically oriented in their lives.
[1:46:01.56] They just don't know it.
[1:46:02.48] And then I start giving examples and doing all these discussions and they start thinking, oh, OK, and that's one thing I need to start doing from now on.
[1:46:13.92] For those who disagree, or who think that ethics is just all legality:
[1:46:20] There's a descriptive and there's a prescriptive part.
[1:46:23.2] One thing is to say that there is no ethics in the business world today.
[1:46:27.76] But another is to say that we ought to improve things. Because I start giving examples and they're like, oh, this is wrong.
[1:46:36.44] Oh, this is wrong.
[1:46:37.12] Oh this is right.
[1:46:37.88] You know, so, that there are no ethics nowadays in the business world, because everyone is trying to take advantage of everyone else, does not mean that this is how it should be.
[1:46:52.52] But that's a different story.
[1:46:53.8] So here everyone seemed to be on board so it was way easier for me to cruise through it.
[1:47:03.2] But but yeah this is this is an idea.
[1:47:05.72] I'll think about it, because again I'm trying to scale this up a bit in a way that does not depend on me being there and talking and talking and talking.
[1:47:17.88] So, yeah, thanks for the idea.
[1:47:20.84] Maybe we'll try it with the small bots.
[1:47:22.32] We'll try to make him, yeah, one of these things.
[1:47:27.52] Actually, yeah, I might get you in touch with Louis too, because he's good at this stuff,
[1:47:32.92] to give it a shot, because you can start experimenting and get a gut feel for how it looks, and we can probably
[1:47:40.08] create, like, another bot, like the Mahmoud bot.
[1:47:43] The ethics No.
[1:47:45.8] Decaf bot, yeah.
[1:47:48.12] Self Reflecting Talent Bot.
[1:47:50.24] Yeah, yeah, yeah.
[1:47:52.08] And then put them
[1:47:54.04] to talk to each other; that would be interesting.
[1:47:58.84] But yeah, I think so.
[1:47:59.8] I'm,
[1:48:00.04] I'm also starting at this. Again, this is where the book came from; that's what I was thinking about.
[1:48:04.88] So, when it comes to philosophy, it's not about what this person or that person said, and that's why I didn't discuss utilitarianism in depth here.
[1:48:11.52] I just talked about the frameworks.
[1:48:13.92] Broadly speaking.
[1:48:15.36] There's a lot more to them blah blah blah.
[1:48:18.04] But that's what it does.
[1:48:21.92] Discussing, having awareness, is important.
[1:48:24.8] Questioning our assumptions is important.
[1:48:27.08] And it doesn't mean that you should agree with other people.
[1:48:29.32] It doesn't mean that you should adopt other people's viewpoint.
[1:48:33.48] It's just understanding that we have different viewpoints.
[1:48:37.16] And this is the starting point, developing awareness to know where you stand and then asking questions and this leads you to uncertainty and risk and then meaning.
[1:48:46.16] And then towards the end the moral responsibility.
[1:48:50.08] And I have to share this quote from Viktor Frankl, because it's amazing, given that you mentioned Frankl a lot.
[1:49:00.84] And it's funny, because for him ethics comes almost towards the end.
[1:49:07.96] In fact, freedom is in danger of degenerating into mere arbitrariness unless it is lived in terms of responsibleness.
[1:49:15.76] That is why I recommend that the Statue of Liberty on the East Coast be supplemented by a statue of responsibility on the West Coast.
[1:49:24.32] And that's I think what is usually forgotten or not accounted for in these discussions.
[1:49:30.56] Like in general, it's like freedom and we want to do whatever we want.
[1:49:34.56] And yeah, but then there's responsibility.
[1:49:40.28] So yeah, yeah.
[1:49:43.64] Any any other questions, thoughts, impressions you'd like to share?
[1:49:49.16] There was a recent paper how they basically all GPT do so kind of double edged sword.
[1:49:56.12] Yeah, because we were discussing in the chat how somebody was prompted to commit suicide,
[1:50:04.88] if I understood correctly, which is one of the dangers of GPT. And just as an aside, given that we mentioned that, someone made me aware of this:
[1:50:17.72] Apparently now they don't say commit suicide because it implies that the person committed the crime.
[1:50:26.16] They say attempted suicide.
[1:50:28.84] And so this is interesting too: the ethical dimension of the words that we use, how we refer to these things, labeling them as either good or bad implicitly, even when we are not trying to do that.
[1:50:43.92] Right.
[1:50:44.12] Like I said, commit suicide is just a phrase that we use.
[1:50:48.48] So I asked her, like, I don't know what you're talking about, because I've never thought about it, you know. And then she explained, because she used to work with the Red Cross and with a helpline,
[1:51:05.24] what do they call them,
[1:51:06.64] a suicide helpline.
[1:51:09.04] I don't know what they call them, but you know, and this is where they teach them:
[1:51:13.36] like, yeah, it's not commit, you're not trying to commit a crime.
[1:51:17.48] But anyway, that's that's on the side.
[1:51:20.44] Yeah, OK.
[1:51:21.16] It's it's complicated.
[1:51:25.44] Thanks Mahmoud.
[1:51:26.16] This was very interesting.
[1:51:27.8] I took something interesting from it.
[1:51:29.68] Good to see.
[1:51:30.28] Yeah.
[1:51:31.28] This was one discussion, and I was really glad.
[1:51:34.96] And again, the question is not about attendance.
[1:51:38.52] The question is about sign-ups.
[1:51:39.96] Yeah, yeah.
[1:51:40.48] I'm really curious because.
[1:51:42] Yeah, if it's the ethics or business ethics, then the title could be, as Hassan said, something, I don't know.
[1:51:54.28] This is the thing.
[1:51:55.52] Forget it, is it ethical
[1:51:58.2] to find some clickbait?
[1:52:00.96] But I don't like that.
[1:52:02.04] That's why this is my problem.
[1:52:04.16] It's like I wanted to call it something else but then I was like, I cannot.
[1:52:09.08] And Hassan, for example, had a different subtitle, how to solve problems, etcetera.
[1:52:13.12] That's not me.
[1:52:14.8] I felt bad, because I'm not doing that in the book.
[1:52:17.76] I'm doing something else.
[1:52:19.6] It's not to my liking.
[1:52:21.44] I don't even know what I'm doing in the book; it's an exploration of topics. But yeah, again, that's me.
[1:52:28.96] Is it character, is it action, is it...
[1:52:31.2] I don't know what it is.
[1:52:32.24] I have to be straightforward,
[1:52:34.76] even if that means I'll be a bridge dweller and sacrifice opportunities to become a multimillionaire.
[1:52:46.32] Now, the ethical people would tell you that if you persevere, you will eventually become one.
[1:52:50.4] I don't know.
[1:52:50.92] I'm not so sure about that.
[1:52:51.84] I read another tweet that they are convinced that you cannot become a billionaire if you're unethical.
[1:52:57.76] Some people hate the rich because it's unethical.
[1:52:59.56] But I'll stop.
[1:53:00.88] I can rant forever.
[1:53:03.32] OK, Mahmoud, take care everyone, maybe.
[1:53:05.24] Thank you very much for doing this, and hope you enjoyed
[1:53:09.12] the discussion.
[1:53:10.12] Have a good day.
[1:53:10.64] Thanks, Sasha.
[1:53:11.16] Thanks a lot, Dennis Mahmoud.
[1:53:12.92] Bye, bye.
[1:53:13.56] Thanks everyone.
[1:53:14.72] Cheers.