
Londoniyyah - Part 8 - Consequentialism | Mohammad Hijab (2021-11-25)

Description

Londoniyyah - Part 8 - Consequentialism | Mohammad Hijab

To stay updated about our content, please subscribe and turn on notifications.

BOOK A LIGHTHOUSE MENTOR

Are you or someone you know doubting Islam? Do you find yourself struggling to find answers? Do you have a hard time speaking to someone about Islam? Are you considering Islam but are unsure about certain concepts? Are you an activist, Imam or community leader who is unsure about how to handle questions related to science, philosophy, the Islamic moral code, etc.?

You are not alone. Over the course of the last decade or more there has been a rapid proliferation of content online and in academic institutions that has eroded the faith of some people.

Seeing the rise of this phenomenon, Sapience Institute is introducing a one-to-one mentoring service called LIGHTHOUSE.

BOOK A MENTOR HERE: https://sapienceinstitute.org/lighthouse/

VISIT our website for articles in English, Spanish, and Turkish, for our mentoring service and learning platform, and for speaker requests: https://sapienceinstitute.org/

Summary of Londoniyyah - Part 8 - Consequentialism | Mohammad Hijab

*This summary is AI-generated; there may be inaccuracies.

00:00:00 - 00:40:00

Consequentialism is a philosophical theory that holds that the right action is the one that has the best consequences. Mohammad Hijab discusses the merits and demerits of consequentialism. He points out that consequentialism solves some of the problems of deontological ethics, but that it can also lead to wrong decisions.

00:00:00 In this eighth video in a series, Mohammad Hijab discusses consequentialism. Consequentialism holds that the right action is the one that has the best consequences, and utilitarianism is a form of consequentialism. Deontological ethics, the opposite of consequentialism, says that actions have intrinsic value.

  • 00:05:00 Consequentialism judges an action by its consequences; unlike utilitarianism, it can replace desirability with any other end. There are different types of consequentialism, each starting from a different premise and each with its own arguments.
  • 00:10:00 Mohammad Hijab discusses the merits and demerits of consequentialism. He points out that consequentialism solves some of the problems of deontological ethics, such as the rule that you can never lie, since on consequentialism the consequences of our actions do matter. He also discusses how consequentialism is oriented toward the future, while only the past is available for making decisions in the present. Finally, he discusses how consequentialism can lead to wrong decisions, since it requires people to weigh probabilities against consequences.
  • 00:15:00 Mohammad Hijab discusses the justification for consequentialism: whatever end one starts with, any step toward it counts as justified. He points out that there is a difference between desirability and desire, and that the naturalistic fallacy can occur when trying to justify something based on its desirability alone. Mill argued that something is desirable because we desire it, which is a form of the naturalistic fallacy.
  • 00:20:00 Discusses the philosophical concept of consequentialism, which holds that the consequences of an action are the only legitimate measure of its morality. David Hume argued that moving from an "is" to an "ought" is a naturalistic fallacy. Peter Singer, a prominent utilitarian, has argued that disabled children can be killed because their pain outweighs their pleasure, and that a child dying nearby is morally equivalent to a child dying thousands of miles away. Sam Harris makes a similar welfare-based consequentialist argument, which Hijab says rests on a circular justification, just as Mill's did.
  • 00:25:00 Mohammad Hijab discusses consequentialism and its flaws. He points out that if intentions are excluded, actions such as murder and manslaughter are treated the same because they have the same consequences. He also introduces the Islamic Sharia system, which contains consequentialist principles.
  • 00:30:00 Mohammad Hijab presents three principles (from al-Ghazali) that must hold before acting on public interest: the action must affect the whole Muslim community, its outcome must be near-certain, and it must be necessary to protect the essential aims of Islam. He gives an example from Islamic history concerning when the use of human shields could ever be justified.
  • 00:35:00 Mohammad Hijab discusses the principles of Islamic consequentialism, under which an action must serve the interest of the whole community, not merely a majority. He discusses the case of Aisha, who was nine years old when the marriage to Prophet Muhammad was consummated, and argues that a consequentialist critique must first demonstrate that harm resulted in that historical context; if harm is shown, Muslims would agree, which is why such marriages are not endorsed today.
  • 00:40:00 Mohammad Hijab explains that because harm is context-specific, the objection is not universally applicable. He goes on to discuss how the age of consent changed with first-wave feminism and the spread of education. He concludes that, in today's age, it is important to be aware of the different approaches to ethics and to be able to differentiate between them.

Full transcript with timestamps:

0:00:13 welcome to the eighth session of this course
0:00:16 today we're going to be talking about
0:00:17 consequentialism
0:00:18 and we've already spoken about
0:00:19 utilitarianism which is a form of
0:00:22 consequentialism and we're going to be
0:00:23 speaking about how that is so
0:00:25 and in the previous session we spoke
0:00:26 about deontological ethics which in many
0:00:28 ways is the opposite of consequentialism
0:00:31 or the main competitor to
0:00:33 consequentialism
0:00:34 so before we start we're going to read
0:00:36 as per tradition
0:00:38 the part of the poem
0:00:40 which is applicable here
0:00:42 [recitation]
0:01:03 right so consequentialism
0:01:06 what is consequentialism we have uh some
0:01:09 kind of definition here in the first
0:01:10 slide so let's read that
0:01:11 of all the things a person might do at
0:01:13 any given moment the morally right
0:01:15 action is the one with the best overall
0:01:17 consequences to put this in more
0:01:19 succinct terms
0:01:20 if action x causes y consequence then x
0:01:23 action is morally justified
0:01:25 now this as we've kind of just alluded
0:01:27 to a second ago
0:01:29 is the opposite of deontological
0:01:31 ethics deontological ethics says that an
0:01:33 action has intrinsic value you're doing
0:01:35 it because it's good in and of itself
0:01:38 okay it's you don't lie because lying is
0:01:41 wrong in and of itself you don't break
0:01:44 promises because breaking promises is
0:01:45 wrong in and of itself
0:01:47 you don't uh waste your talents he said
0:01:50 uh can said because wasting your talents
0:01:52 is wrong in and of itself
0:01:54 you don't commit suicide can't mention
0:01:56 because committing suicide is wrong in
0:01:58 and of itself
0:02:00 why did can't come to this conclusion
0:02:01 because he used a categorical imperative
0:02:04 and a universalizing maxim
0:02:06 he said if everyone in society does this
0:02:10 for example lying it will cause an
0:02:11 impossible dysfunctional society
0:02:14 therefore the action in and of itself is
0:02:17 wrong
0:02:18 now the consequentialists will say
0:02:20 this is problematic because there are
0:02:22 examples where
0:02:23 for example the liar example that we
0:02:26 gave in the previous session
0:02:28 that the consequences of said action can
0:02:31 cause
0:02:32 a more severe outcome for example if you
0:02:35 don't lie sometimes
0:02:36 then you can die
0:02:40 like we see this actually it's not it's
0:02:41 not a abstract thing this is not
0:02:44 hypothetical we see people
0:02:46 for example being forced to say 'i
0:02:48 believe my god is bashar al-assad' or
0:02:50 something, or 'if you don't say this we will
0:02:51 shoot you to death', or to say that
0:02:53 krishna is your god or something. these
0:02:55 muslim minorities are being persecuted
0:02:58 in india and obviously they don't
0:02:59 believe that
0:03:00 that is the case but if you don't say
0:03:02 this will kill you
0:03:03 now on this ethic you actually
0:03:06 can't say that, on the deontological ethic, in any
0:03:08 way shape or form,
0:03:11 but the consequentialist ethic will say
0:03:13 actually the consequences of not saying
0:03:14 this is death and therefore
0:03:17 it's completely justified for you to lie
0:03:18 at this point which we would agree with
0:03:20 of course as muslims because we do have
0:03:22 as we'll come to see
0:03:24 uh what i refer to as sharia
0:03:25 consequentialism there are aspects of
0:03:27 the sharia which accord with a certain
0:03:30 consequence however the question is
0:03:32 when it comes to consequentialism what
0:03:34 is the base and what differentiates
0:03:36 consequentialism from utilitarianism
0:03:39 in utilitarianism, what is the master
0:03:41 question
0:03:42 what are the two things which we must
0:03:44 consider when making a decision on
0:03:47 utilitarian grounds
0:03:50 that's true but before that there's
0:03:52 something more fundamental
0:03:54 pain and pleasure okay
0:03:56 so you have two lords bentham said
0:03:59 you have the lord of pain and you have
0:04:00 the lord of pleasure and then as you've
0:04:02 mentioned
0:04:03 we discuss we discover what is the
0:04:06 greatest pain greatest pleasure for the
0:04:08 greatest number of people so you've got
0:04:10 two steps you've got what what pain and
0:04:12 pleasure is and then we call it
0:04:14 desirability for the sake of argument
0:04:16 and then you've got you know the
0:04:18 greatest good for the greatest number
0:04:20 now it doesn't have to be that you start
0:04:22 with desirability
0:04:25 you could start off by talking about
0:04:27 welfare
0:04:28 you could start off by talking about
0:04:31 instead of it being pain and pleasure
0:04:32 you can replace pain and pleasure with
0:04:34 anything
0:04:35 and still have a consequentialist
0:04:36 ideology
0:04:38 so you can say if it's about economic
0:04:40 growth
0:04:41 we are trying to create economic growth
0:04:45 so anything which creates economic
0:04:47 growth the ends justify the means
0:04:50 for example
0:04:51 so the reason what differentiates
0:04:53 utilitarianism from consequentialism
0:04:55 more generally is that utilitarianism
0:04:57 very strictly starts off with
0:04:59 desirability as the premise whereas
0:05:02 consequentialism can replace
0:05:03 desirability with anything else
0:05:05 can replace it with economic growth can
0:05:07 replace it with we want a society where
0:05:09 we see uh the maximum number of x in
0:05:12 society
0:05:14 you know uh families
0:05:16 you could have family consequentialism
0:05:18 you can have uh
0:05:19 for you know you just fill in the gaps
0:05:21 you can have something right it doesn't
0:05:22 have to be pain and pleasure
0:05:24 directly
0:05:26 so that's what differentiates
0:05:27 utilitarianism from consequentialism in
0:05:30 a sense consequentialism is the master
0:05:31 key category
0:05:33 and underneath it you have a
0:05:34 utilitarianism so utilitarianism
0:05:37 all utilitarians are consequentialists
0:05:38 but not all consequentialists are
0:05:40 utilitarians
0:05:41 very straightforward okay
0:05:44 now
0:05:50 there are different types of
0:05:51 consequentialism
0:05:53 so i've there's four here which i've
0:05:55 mentioned in the slides that you can see
0:05:58 rule consequentialism dual
0:06:00 consequentialism egoistic
0:06:02 consequentialism friendly
0:06:03 consequentialism now you can read those
0:06:05 definitions in your own time
0:06:07 but obviously all of those different
0:06:08 types of consequentialism start with a
0:06:11 different premise
0:06:13 egoism is a it's an ideology in and of
0:06:15 itself which we'll actually cover by
0:06:17 itself
0:06:18 egoism is the idea ethical egoism is
0:06:20 that you should do whatever is in your
0:06:22 best interest
0:06:23 this is uh propounded by someone like
0:06:25 ayn rand
0:06:26 she's a woman who
0:06:28 had different
0:06:30 philosophies but this was one of them
0:06:31 she was an ethical egoist so anything
0:06:33 that is in your it's the opposite of
0:06:35 altruism anything that is in your best
0:06:37 interest that's the good thing to do
0:06:39 so egoistic
0:06:42 consequentialism
0:06:44 is an expansion of that now
0:06:46 but as john mackie
0:06:48 mentions in his book, with this
0:06:49 you can have contradictions
0:06:51 because sometimes you can you can
0:06:53 you want something
0:06:55 but it's in contradiction with what
0:06:56 someone else wants and so the the the
0:06:59 consequences for the majority can
0:07:01 sometimes conflict with the consequences
0:07:03 for the individual
0:07:05 and these are the kinds of tensions that
0:07:07 you may find in consequentialist
0:07:09 ideology
0:07:11 now what i want you to think about with
0:07:12 the person next to you right now
0:07:15 is are the merits and demerits of
0:07:18 consequentialism what do you think makes
0:07:20 sense
0:07:21 about consequentialism especially in
0:07:22 conjunction with what we studied last
0:07:24 week or on saturday
0:07:26 what do you think makes sense about
0:07:27 consequentialism
0:07:29 and what do you think are already the
0:07:30 weaknesses of consequentialism let's try
0:07:32 and get the synapses uh working and
0:07:34 let's activate the schemata and let's
0:07:37 speak to the person next to us and let's
0:07:39 have three minutes and then we'll come
0:07:40 back
0:07:41 and see what you guys have to say okay
0:07:44 go ahead that's the problem yeah the
0:07:45 robin hood i think you can say
0:07:48 so
0:07:49 if you have um a majority of poor people
0:07:53 and to feed them you're gonna steal from
0:07:55 the rich or do something that's against
0:07:58 the law
0:07:59 that's um a form of consequentialism i'd
0:08:03 say the ends justify the means yes
0:08:05 correct this is actually it's mentioned
0:08:07 in the literature quite quite often
0:08:09 actually but consequentialists do have a
0:08:11 reply to this
0:08:13 and the reply that they usually use is
0:08:14 that
0:08:16 if the consequences of theft
0:08:19 for example as a whole to the society
0:08:21 are worse than the money that you'd be
0:08:23 giving the poor people, because a
0:08:26 society that has plunged into like an
0:08:29 anarchical system of people stealing
0:08:31 from each other
0:08:33 is worse than so they'll make that
0:08:35 argument but this is definitely a very
0:08:37 valid point
0:08:38 and i don't see why you can't make the
0:08:39 argument as a consequentialist that you
0:08:41 know theft
0:08:42 because the ends justify the means,
0:08:44 uh thefts can be uh at least acceptable
0:08:47 in some circumstances maybe not in all
0:08:49 circumstances but in some
0:08:51 circumstances and you can see the
0:08:53 compatibility therefore between
0:08:55 consequentialism and socialism for
0:08:57 example or even communism you know it's
0:08:59 not incompatible for consequentialism to
0:09:01 be
0:09:02 uh to be coupled with one of those two
0:09:04 left
0:09:06 ideologies. what we do have by the way
0:09:08 sessions on marxism as well so yeah
0:09:11 like um the term was
0:09:13 i think you mentioned tyranny of the
0:09:15 majority yes in this case it would be
0:09:17 tyranny of the minority where you've got
0:09:20 a small amount of um
0:09:22 people who are hoarding all the money
0:09:25 uh that in that case that
0:09:26 consequentialism would be valid yeah so
0:09:29 this is and there's arguments to be made
0:09:31 on both sides so so the economic
0:09:33 arguments that are made
0:09:35 are usually made between supply-side
0:09:37 economists okay and what we call
0:10:39 keynesian economists so supply-side
0:10:41 economists of which margaret thatcher was
0:09:43 one for example right the argument she
0:09:45 made
0:09:46 which is not completely false but there
0:09:48 are some issues of it which are you know
0:09:50 there are problems with it
0:09:52 is that when the gdp of a country gets
0:09:54 bigger
0:09:55 so in other words when you have more
0:09:56 rich people in the country
0:09:58 absolute poverty
0:10:00 is diminished; relative poverty of
0:10:02 course is exacerbated. so
0:10:07 yes, trickle-down economics they call
0:10:08 it, they call it that for
0:10:10 that reason. so uh but it's interesting
0:10:13 because both
0:10:14 kind of left-leaning politicians as well
0:10:17 as as kind of like you know right-wing
0:10:19 economists or really liberal economists
0:10:22 yeah supply-side economists they both
0:10:24 make an argument from consequentialism
0:10:26 both of them make the same argument one
0:10:27 of them is saying that the consequences
0:10:29 of us
0:11:30 a socialist-type structure
0:10:33 like the keynesian model is better
0:11:35 and then some other ones like uh
0:10:38 thatcher yeah uh adam smith uh
0:10:41 obviously he's a
0:10:43 philosopher
0:10:44 and they would make the argument that if
0:10:46 ever if if there are rich people in the
0:10:49 house
0:10:50 imagine the house is the economy
0:10:52 if there's rich people in the house and
0:10:54 the family then the whole family gets
0:10:55 richer
0:10:56 even though there's more inequality
0:10:58 between family members they're better
0:11:00 off yeah he's saying that the whole
0:11:02 country gets better off
0:11:04 so what would a consequentialist say? you know he
0:11:06 said the tyranny of the majority,
0:11:09 the minority, he said, no no yeah but the
0:11:11 tyranny of the minority is caused
0:11:13 because of the tyranny of the majority
0:11:15 because they are not spreading the funds
0:11:17 as they should which is causing the
0:11:19 tyranny of the minority, having
0:11:20 people acting like robin hood yeah so that's what
0:11:22 that's what a socialist would argue
0:11:24 that's what someone like left-wing would
0:11:26 left-wing economist, a keynesian
0:11:27 economist would argue
0:11:29 marx would argue that of course
0:11:31 but his argument is more sophisticated
0:11:32 than that it's much more sophisticated
0:11:34 than that and that's why it needs a
0:11:36 separate treatment we have a separate
0:11:37 treatment for marx in fact his one's one
0:11:39 of the most difficult ones that we're
0:11:41 going to cover here because it requires
0:11:43 real thought uh he has a different
0:11:45 starting point yeah it's marxism is not
0:11:48 as easy to deconstruct as some of these
0:11:50 other ones
0:11:51 so yeah anything else from this side
0:11:54 merits and demerits by the way we're
0:11:56 talking about it
0:11:59 yeah so um we said one of the merits was
0:12:00 that it gets rid of all of the problems
0:12:02 of absolutes
0:12:04 so basically all of the negatives from
0:12:06 deontological ethics are kind of solved
0:12:08 through yes it's a very straightforward
0:12:09 it's low-hanging fruit and it's correct
0:12:11 so for example some of the things that
0:12:12 we saw the tensions that we saw with
0:12:14 deontological ethics for example you can
0:12:16 never lie but we're in a situation where
0:12:18 someone's about to die and so on that
0:12:20 that is solved in a sense
0:12:22 because we do consider the consequences
0:12:24 with this ethic
0:12:28 or like for example if someone says i'm
0:12:30 a consequentialist
0:12:32 and then can we say to them that okay so
0:12:35 to your world view
0:12:36 you will accept that for example
0:12:39 america bombing nagasaki hiroshima
0:12:42 to kill civilians to protect their own
0:12:44 people yeah no but this is difficult i
0:12:46 wouldn't use that example the reason why
0:12:48 i wouldn't use that example is because
0:12:49 you can make arguments for and against
0:12:50 just like the economic example because
0:12:52 if they will say look there's nothing
0:12:54 that justifies the killing of that many
0:12:56 people and in fact a nuclear
0:12:59 attack like this could
0:13:03 merit a response,
0:13:06 okay, so the consequences for
0:13:08 society are
0:13:10 uh overall in the negative
0:13:12 so a consequentialist could just as easily
0:13:15 say
0:13:16 okay,
0:13:18 this is not good on consequentialism, just as
0:13:19 you can make an argument on
0:13:20 consequentialism that it is good.
0:13:22 the missing component
0:13:24 here is as follows yeah
0:13:26 no one knows the future
0:13:28 this is the missing component like i
0:13:30 don't know what's going to happen in the
0:13:31 future okay
0:13:33 and because i don't know what's going to
0:13:34 happen in the future i don't know if i
0:13:35 do action x whether that's going to have
0:13:37 consequence y or not. so what
0:13:40 people have to do
0:13:42 in society and economy and politics and
0:13:44 so on even a family
0:13:46 is that they actually have to weigh
0:13:47 things up
0:13:48 and that's a very natural thing to do
0:13:50 how probable is it
0:13:53 that if i do x
0:13:55 that y will happen negative consequence
0:13:58 on the flip side how probable is it that
0:14:00 if i do y z will happen positive
0:14:03 consequence
0:14:04 and this is the difficult thing that we
0:14:07 all actually do in our daily lives i
0:14:08 mean
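
The weighing the speaker describes here is, in effect, an expected-value calculation. A minimal sketch in Python, with invented actions, probabilities, and outcome values purely for illustration (none of these numbers come from the lecture):

    # Hypothetical consequentialist weighing: each action has possible
    # outcomes, each with an estimated probability and a value.
    # All names and numbers below are invented for illustration.
    actions = {
        "lie":        [(0.9, +10), (0.1, -5)],   # (probability, value)
        "tell_truth": [(0.2, +10), (0.8, -50)],
    }

    def expected_value(outcomes):
        # weigh each consequence by how probable we judge it to be
        return sum(p * v for p, v in outcomes)

    # pick the action whose weighted consequences come out best
    best = max(actions, key=lambda a: expected_value(actions[a]))
    print(best)  # -> "lie"

The lecture's objection survives the formalism: since no one knows the future, the probabilities themselves are only guesses, and as the exchange that follows notes, they can also be adjusted after the fact to justify a decision backwards.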
0:14:10 the example of hiroshima,
0:14:12 like,
0:14:13 it was to prove that
0:14:15 consequentialism
0:14:17 can be wrong
0:14:20 so the we know that
0:14:22 nagasaki and hiroshima were bombed
0:14:25 knowing that japan is already
0:14:27 surrendering right there
0:14:30 but we have hindsight now we can look
0:14:32 back yeah but when you're when you're
0:14:34 basing morality on consequentialist
0:14:36 ethics you're basing it in the future
0:14:38 sense you can use consequences you can
0:14:40 use the past yeah yeah that's what
0:14:42 that's the only thing you can use right
0:14:44 so but i can do something wrong and
0:14:46 then use it
0:14:48 afterwards to justify it, look,
0:14:51 this
0:14:51 was part of the plan
0:14:54 yes it was part of the plan yeah how to
0:14:56 do it
0:14:56 yes yes yeah it makes sense yes it does
0:14:59 maybe it wasn't needed yeah yeah yeah
0:15:01 exactly you can always you could always
0:15:02 justify backwards
0:15:04 as well
0:15:05 yeah you're right about that but
0:15:06 what i was going to say was
0:15:09 there's something more fundamental here
0:15:10 that we need to think about right
0:15:12 we said that the the predicate base
0:15:15 of consequentialism
0:15:17 is say
0:15:19 on utilitarianism we said it's pain and
0:15:21 pleasure right
0:15:22 but on other systems like for example if
0:15:24 we talk about welfare economic growth
0:15:26 whatever it is that we're saying we're
0:15:28 going to start with
0:15:29 the question is going to always be why
0:15:31 should we start with that
0:15:34 what like we did with utilitarianism
0:15:36 right
0:15:37 we said was pain and pleasure the two
0:15:38 most important things they're the gods
0:15:41 that's what bentham said
0:15:42 yeah pain and pleasure are the two gods
0:15:45 okay why
0:15:47 why
0:15:48 david hume interestingly he said
0:15:49 something he said that if you ask
0:15:51 someone why they don't like pain there's
0:15:53 no answer after that
0:15:56 he said if you ask someone why you don't
0:15:58 like pain they will have there's no
0:16:00 it's such a base level thing that you
0:16:02 don't need to have an answer for it and
0:16:04 there is some truth in this and this
0:16:06 statement we're not we're not
0:16:07 disagreeing with that it is very
0:16:08 instinctive to believe that pain is bad
0:16:11 because it hurts you
0:16:12 yeah or in fact you can say 'because
0:16:13 it hurts me' right
0:16:16 does that fall into that no this is more
0:16:18 consequentialism we're still on
0:16:19 consequentialism now what we're saying
0:16:21 is
0:16:22 what are the
0:16:24 consequent the consequences of what
0:16:26 we're trying to achieve
0:16:29 are we trying to achieve economic growth
0:16:31 are we trying to achieve okay we
0:16:33 the barometer for our success is how
0:16:35 many stable families there are in
0:16:37 society how many hospitals there are in
0:16:38 our society how many kitchens there are
0:16:41 whatever it is
0:16:42 whatever you start off with and say well
0:16:44 if i want
0:16:45 x
0:16:46 then anything which goes against this
0:16:49 in consequence is negative, and anything which goes for it
0:16:51 in consequence is positive,
0:16:53 a step in the right direction
0:16:55 whatever x is, whatever this thing we're
0:16:57 starting with how can that be justified
0:17:00 i want economic growth why should we
0:17:02 want
0:17:03 profit maximization over even revenue
0:17:05 maximization i mean there's a difference
0:17:07 by the way in economics why should we
0:17:09 want
0:17:10 um
0:17:11 why should we want uh desirability over
0:17:16 a stable society
0:17:19 why why is that the bar why is it yeah
0:17:21 exactly yeah because because we talked
0:17:23 about the consequences of that of
0:17:25 pornography and alcohol yeah it's
0:17:27 maximum pleasure but that's coming with
0:17:28 consequences you know which we see i was
0:17:31 reading a book called irreversible
0:17:32 damage yeah and was talking about how
0:17:35 you know depression and suicide in a lot
0:17:37 of girls for example as well have
0:17:39 increased so we live in a society of
0:17:40 maximum pleasure but we've seen we're
0:17:42 not seeing fruits you know we've seen
0:17:44 rotten fruits if you so can we not say
0:17:46 that your bar is maximum pleasure but
0:17:48 now it's coming at the cost of people
0:17:50 have it paid
0:17:52 you could say that absolutely you can
0:17:54 say that absolutely but you can what's
0:17:56 even what's even easier
0:17:58 is to ask them why should we maximize
0:17:59 pleasure yeah why yeah why it's just a
0:18:02 simple question of why like why should
0:18:04 it be this
0:18:05 like you ask a socialist why should it
0:18:06 be uh equality in economics you ask a
0:18:10 supply-side economic economist why
0:18:12 should it be economic growth
0:18:13 they all have an answer because this is
0:18:15 a better society it's more fair it's
0:18:16 whatever then you keep asking why then
0:18:19 you've got problems because the level of
0:18:20 justification will get harder and harder
0:18:23 then you have to get something which is
0:18:25 a meta ethic if you like that's going to
0:18:27 explain this which in the most in most
0:18:30 instances you won't have such a thing so
0:18:31 what would they say if i said for
0:18:33 example like why maximum pleasure and
0:18:35 they said um because it brings happiness
0:18:38 uh that's a circular argument this is
0:18:40 exactly what we found what what would
0:18:42 they say like imagine you're speaking to
0:18:43 yeah what would they usually say what's
0:18:44 their like
0:18:46 what we we don't deal with what the
0:18:47 lay people say, we deal with
0:18:49 what the heads of this field say, okay,
0:18:51 what the heads say. so who remembers on
0:18:53 utilitarianism you were here with
0:18:54 utilitarianism
0:18:59 so who remembers the justification that
0:19:01 uh that mill used
0:19:04 it's desirable because you desire kind
0:19:06 of thing yeah yeah yeah yeah yeah he was
0:19:08 using he was saying that desirability
0:19:11 is desired
0:19:13 because
0:19:14 it's desired
0:19:16 it's desirable
0:19:18 yeah yeah yeah
0:19:20 but he
0:19:21 yeah but the thing is
0:19:23 the only way this doesn't
0:19:25 this can make sense is if we if we talk
0:19:27 if we if we differentiate between
0:19:29 desirability and desire
0:19:31 now if we differentiate between
0:19:32 desirability which is the is and the or
0:19:35 that ought to be desired
0:19:37 then now you have the naturalistic
0:19:38 fallacy which we talked about we have
0:19:40 the how do you get from there is to the
0:19:42 ought how do something is the case to
0:19:44 how it should be the case
0:19:46 what david hume referred to as a
0:19:47 naturalistic fallacy because you can be
0:19:49 a certain way it doesn't mean you should
0:19:50 be you can be a certain way it doesn't
0:19:52 mean you should be that way
0:19:54 like for example if you say i was born
0:19:55 gay
0:19:57 okay i was born gay doesn't mean you
0:19:58 should be gay
0:20:00 no this is good because if you say i was
0:20:02 born gay therefore i should be gay
0:20:04 that's the naturalistic fallacy
0:20:05 according to david hume
0:20:06 if i was born gay therefore i should be
0:20:08 or if i was born incestuous if we take
0:20:10 the oedipus complex seriously then
0:20:12 therefore i should be that that jump
0:20:15 is an unjustifiable jump that there's
0:20:18 that is a fallacy in fact
0:20:21 that's a statement filled with
0:20:22 philosophy
0:20:25 so you weren't there for the people
0:20:27 not felicity that's something else
0:20:30 you know but
0:20:31 you see what i'm talking about here so
0:20:33 there is or and if they that's why it's
0:20:35 very important
0:20:37 to it's just an infinite regress of
0:20:39 justification there's going to be a
0:20:40 button there's going to be a bottomless
0:20:42 pit of
0:20:44 no answers
0:20:46 and so we have to question what the
0:20:48 premise is
0:20:49 what are the consequences look at sam
0:20:50 harris he's not even in the literature
0:20:53 by the way because he's not a ethicist
0:20:55 or something like that but he's made a
0:20:56 little book about and he's talking about
0:20:58 consequentialism and stuff
0:21:00 but welfare and these things but once
0:21:02 again why should we have that
0:21:06 and then he's going to have some kind of
0:21:07 circular justification just like mill
0:21:08 did they all have some kind of circular
0:21:10 justification just like mill did because
0:21:12 there's no way out we should have it
0:21:14 because we like it because it's good or
0:21:16 that that's it's good because it's good
0:21:17 basically is that what i want to say
0:21:19 it's good because it's good that's
0:21:21 circular argument that's that's circular
0:21:22 argument 101 yes
0:21:25 would a better response on this side be
0:21:28 like maybe like going to like a more
0:21:29 biological level where you say that
0:21:32 pleasure is desired because
0:21:34 you know um
0:21:35 because it leads to our survival um as a
0:21:38 species but why is why is survival good
0:21:41 why should we why should we you see
0:21:44 you've moved to from uh from is to ought
0:21:48 you're assuming that survival is a good
0:21:50 thing
0:21:50 it's not necessarily assumed in ethics
0:21:53 that is a good thing
0:21:54 that's uh why is a good thing well
0:21:56 because evolutionarily we there's two
0:21:58 things that are connected
0:21:59 that existence is just better than
0:22:01 non-existence like being able to exist
0:22:04 and being able to survive and continue
0:22:06 to live is is just better no you
0:22:08 couldn't because hitler his
0:22:10 non-existence is better than his
0:22:11 existence for example and if we have the
0:22:13 world full of uh hitlers then you can
0:22:15 and on consequentialism right but even
0:22:17 even forget what hitler
0:22:19 that premise itself would be difficult
0:22:21 to prove
0:22:22 why is existence better,
0:22:24 better than what, in what sense
0:22:26 in the sense that it's morally it's
0:22:28 morally superior but if you say it's
0:22:30 morally superior then you're already
0:22:32 begging the question so you're assuming
0:22:34 the premise in the conclusion
0:22:36 yeah
0:22:37 it's like you know there's a guy who
0:22:38 comes to speakers' corner and he says
0:22:40 like stop having kids, have you seen him,
0:22:43 yeah you remember that guy, he says you
0:22:45 shouldn't have kids because this world is
0:22:46 full of you know,
0:22:48 antinatalism yeah yeah and then he's
0:22:49 like you know
0:22:50 there should be no existence
0:22:52 so he could argue that in it
0:22:54 yeah it's good that you brought that up
0:22:56 i mentioned this before peter singer
0:22:57 who's a utilitarian
0:22:58 he he actually wrote that disabled
0:23:00 children can be killed
0:23:02 because the pain is more than pleasure
0:23:05 it's the pain it's more it's kind of
0:23:06 like that yeah but it's the pain is more
0:23:07 than the pleasure and stuff like that
0:23:09 and there's all these comparisons
0:23:10 between humans and animals
0:23:12 and black people and slavery and animals
0:23:14 and stuff like that that he makes in his
0:23:16 publications, because he thinks it's all you
0:23:19 know the same thing
0:23:20 but there was one thing that i saw peter
0:23:22 singer do that just a tangent but that i
0:23:24 found interesting and i used it in one
0:23:26 of the uh fundraising
0:23:27 because for him he's a consequentialist
0:23:29 right he's utilitarian and he said he
0:23:31 had this uh he made this ted talk
0:23:34 where in the background he had like uh
0:23:36 he said if a child died right outside
0:23:39 this
0:23:40 place that they're in yeah
0:23:42 is it any is it any different to someone
0:23:44 dying three thousand miles away and he
0:23:46 showed some kids in i don't know
0:23:48 whatever country it was
0:23:49 and he said basically all my worldview
0:23:51 is the same
0:23:53 you know
0:23:54 the truth is i mean we can go into the
0:23:56 technicalities
0:23:57 but i used that for a fundraiser one
0:23:58 time i said look
0:24:00 were you there
0:24:04 lots of hands up yeah we had lots of
0:24:06 hands up there we had a lot of money
0:24:08 yeah yeah so so anyway point being
0:24:13 long story short how do we prove the
0:24:14 premise you can't it's unprovable and
0:24:16 therefore it's better to get them to
0:24:18 justify themselves than for us to try
0:24:19 and justify ourselves that's the golden
0:24:21 rule in it
0:24:22 it's always better to put someone the
0:24:24 back foot
0:24:26 than be on the back foot and that's
0:24:27 something anthony joshua should have
0:24:29 learned before he went in a second
0:24:31 time. that's what he should have known okay
0:24:35 all right now let's um
0:24:37 let's quickly go to the next bit of this
0:24:39 uh
0:24:41 so these are some arguments from the
0:24:43 encyclopedia, i don't know if it's the
0:24:45 online one or the famous one. these are some of
0:24:47 the arguments for consequentialism
0:24:49 right
0:24:50 actions are transient things soon gone
0:24:52 but the basically the results are going
0:24:54 to stay forever
0:24:55 really is that an argument not really um
0:24:58 they say well
0:24:59 maximizing goodness is always better
0:25:01 than not maximizing goodness but once
0:25:05 again why and how, and there's not really
0:25:06 good
0:25:07 arguments there. now, arguments against
0:25:09 consequentialism
0:25:09 what about family bias like for example
0:25:11 i will mention this it's actually
0:25:12 mentioned in the poem as well like if
0:25:15 you've got two kids drowning in the sea
0:25:17 one of them is your own child and one of
0:25:18 them is another person's child
0:25:20 okay
0:25:21 now you've only got the resources to
0:25:24 save one of them which one should you
0:25:26 save
0:25:26 now we would say we have a family
0:25:28 allegiance
0:25:29 um that you should save your own child
0:25:31 first if they're the same age and the
0:25:33 same whatever right and then if you have
0:25:35 time you go for the other child
0:25:37 whereas a consequentialist or even an
0:25:39 individualist
0:25:41 doesn't have the liberty to make that
0:25:42 kind of claim because on what basis are
0:25:45 we prioritizing family kinship ties over
0:25:49 not prioritizing them
0:25:51 the consequence of this child dying is
0:25:53 the same as this child dying
0:25:55 you see so there's no reason to favor x
0:25:58 over y
0:25:59 so it puts into question things like
0:26:02 family ties why are you even providing
0:26:04 for your family
0:26:06 not the giving charity
0:26:08 some will say well actually if you don't
0:26:10 provide for your family your mental
0:26:11 health will not be in order therefore
0:26:13 you won't be able to do it and these are
0:26:14 good arguments but no no problem this
0:26:17 the same problems still apply
0:26:19 like you can't really justify family
0:26:20 ties in that sense and in the kinship
0:26:23 and community ties and patriotism and
0:26:25 all these kind of things you can't
0:26:26 really justify in consequentialism in a
0:26:27 very clear way
0:26:30 what about good intentions and this goes
0:26:31 back to the poem like because if
0:26:33 everything is to do with your intention
0:26:34 uh if everything is to do with the
0:26:36 consequences
0:26:37 then what about the intention and what's
0:26:39 mentioned in the poem
0:26:40 the mentions
0:26:42 is like for example
0:26:43 if
0:26:45 what differentiates
0:26:46 the surgeon
0:26:48 cutting the
0:26:51 the person being operated on with the
0:26:52 lancet,
0:26:54 the surgical tool
0:26:55 versus the criminal putting the knife
0:26:58 in someone's belly
0:27:00 it's only the intention that
0:27:01 differentiates the two things but if
0:27:03 consequences are the only important
0:27:05 thing
0:27:06 and intentions are not then really these
0:27:08 two are the same or to bring it closer
0:27:10 to home we can say this
0:27:12 the one who commits manslaughter the one
0:27:13 that commits murder
0:27:15 like if i run some kid over
0:27:17 because i'm driving erratically and i
0:27:20 didn't mean to
0:27:21 run him over it has the same consequence
0:27:23 as if i wanted to kill that kid
0:27:26 with my intention
0:27:27 so if we eliminate the intention we have
0:27:30 a real problem here because now we're
0:27:31 putting manslaughter and murder in the
0:27:33 same category imagine this
0:27:35 a society with this kind of logic
0:27:38 imagine what kind of uh but they would
0:27:40 probably argue that itself
0:27:43 is a consequentialist understanding
0:27:45 that
0:27:46 if because the
0:27:48 in
0:27:49 not putting intentions into the equation
0:27:51 but then consequentialism is all about
0:27:54 the thing itself having no intrinsic
0:27:56 value so it's all about consequences so
0:27:58 on on a purely consequentialist level
0:28:00 it's the same thing manslaughter and
0:28:02 murder are the same thing because they
0:28:03 have the same consequence
0:28:07 after you've murdered them it's
0:28:09 different from manslaughter,
0:28:10 there'll be more consequences if you
0:28:12 murder them, yeah
0:28:13 they can make those arguments for sure
0:28:15 they can make those arguments definitely
0:28:17 but on a purely numerical level like the
0:28:20 family is going to be upset either way
0:28:22 yeah you see
0:28:23 i mean premeditated murder versus
0:28:26 non-prepayment like
0:28:27 the family might feel the same way if
0:28:29 the person is being murdered
0:28:31 because of an act of passion versus the
0:28:34 person was calculating it before and
0:28:36 thinking about how to do it these things
0:28:38 are all demarcated in law and for good
0:28:40 reason because
0:28:41 it shows as we would argue at least it
0:28:44 shows us a level of
0:28:46 sinister
0:28:47 conniving
0:28:49 um
0:28:50 intention
0:28:51 when someone's thinking about it and
0:28:53 trying to execute it going back to
0:28:54 immanuel kant's famous quote, the cold-
0:28:57 blooded
0:28:58 scoundrel,
0:28:59 he said the cold-
0:29:01 blooded scoundrel was more dangerous than
0:29:03 the erratic one because he's planning
0:29:05 he's doing this he's doing that this
0:29:07 guy's more dangerous the cool-headed one
0:29:08 that could think without his
0:29:09 emotions
0:29:10 but these intentions are important in in
0:29:12 case law and we have seen
0:29:14 that to to blur the distinctions is
0:29:17 going to be difficult
0:29:22 so
0:29:23 is that are there some things which can
0:29:24 never be good now we talked already
0:29:26 about peter singer and killing disabled
0:29:27 babies
0:29:28 like can that be justified
0:29:31 and with that we'll transition to our
0:29:32 own system now the islamic sharia system
0:29:35 okay
0:29:36 because we've kind of covered two this
0:29:38 we've we've done a lot of um
0:29:41 kind of consequentialism we've gone into
0:29:42 the feminism and so on
0:29:44 but let's talk about sharia
0:29:46 consequentialism because there are some
0:29:48 principles in usul,
0:29:51 in the principles of jurisprudence which
0:29:53 are consequentialist in nature which
0:29:55 which assess the consequences
0:29:57 and some of those principles are for
0:30:00 example ad-dararu yuzal, or that harm is
0:30:03 to be averted
0:30:04 it's not like we as muslims
0:30:06 don't agree with this, we agree that harm
0:30:08 should be averted
0:30:10 so much so that it's a principle in the
0:30:11 religion of islam
0:30:13 and we have uh dar' al-mafasid muqaddam 'ala jalb al-masalih,
0:30:20 this is another principle which is that
0:30:22 warding off
0:30:24 that which is
0:30:26 corrosive or corruptive
0:30:28 or disadvantageous is preferred or
0:30:32 put forward above bringing forward
0:30:34 benefits so in other words in our
0:30:36 analysis
0:30:37 we first try and get rid of
0:30:39 negative factors before we
0:30:42 think about bringing in positive ones
0:30:45 these are all qawa'id, they're all
0:30:46 principles in the religion of islam
0:30:51 we have the principle of mafsada, or the
0:30:53 idea of,
0:30:55 uh,
0:30:55 you know, not pain,
0:30:58 you could argue it is
0:31:00 corruption,
0:31:01 how would you translate it?
0:31:03 corruption.
0:31:05 you see,
0:31:08 and maslaha is
0:31:10 common interest on the other hand yeah
0:31:13 common interest what is in the common
0:31:14 interest of the people so we have a lot
0:31:16 of consequentialist reasoning but i'll
0:31:18 give you i'll give you another
0:31:21 example okay
0:31:23 and this is mentioned in an essay of mine
0:31:25 that i wrote and you can find it on
0:31:26 camera called the uk yeah because we
0:31:29 mentioned this
0:31:30 before
0:31:31 corruption yeah all right so um as i
0:31:34 mentioned
0:31:35 al ghazali in his book
0:31:37 okay so just to give you a kind of
0:31:39 background of this
0:31:40 al-juwayni wrote his book
0:31:42 and then
0:31:43 he wrote his book. these are very
0:31:45 important books.
0:31:47 al-mustasfa is one of the most compendious
0:31:50 and uh celebrated books in the uh
0:31:53 principles of jurisprudence in islamic
0:31:55 history,
0:31:56 one of the top three books in fact, so
0:31:57 much so that uh ibn qudamah wrote a
0:32:00 mukhtasar or a summarized version of it
0:32:01 called rawdat al-nazir, which is basically
0:32:03 a summarized version of it
0:32:05 but he has he has a section where he
0:32:07 talks about maslaha, he talks about the
0:32:09 common interest
0:32:11 and he says in order for maslaha
0:32:13 to be,
0:32:14 or in other words for it to be actually
0:32:16 actualized, three things must be in place. look at
0:32:18 this this is very very sophisticated
0:32:20 that these even now consequentialists
0:32:23 have not mentioned these three
0:32:24 principles
0:32:25 and it's very very uh
0:32:27 very detailed he says three principles
0:32:29 have to be put in place
0:32:31 number one it has to be kulli
0:32:34 which means it has to affect the whole
0:32:36 muslim community for instance
0:32:39 or if not the muslim community whoever
0:32:40 is living under muslim rule which
0:32:42 includes christians jews whoever yeah
0:32:45 number two, it has to be qat'i, which
0:32:49 means that we're sure that this is going
0:32:52 to happen
0:32:53 so you know someone could argue we'll
0:32:55 never know for sure what's going to
0:32:57 happen in the future you could argue
0:32:58 therefore
0:33:00 it's at least a high possibility, over 90 percent
0:33:02 yeah
0:33:04 and then the third thing is
0:33:06 it has to be a necessity if these three
0:33:08 things are not in place then you cannot
0:33:11 judge
0:33:12 on maslaha. now, daruri is connected to
0:33:14 the five things islam
0:33:17 came to protect
0:33:19 which is religion
0:33:21 which is
0:33:22 a deen
0:33:28 your life, your intellect,
0:33:30 your uh,
0:33:32 your mal, your wealth,
0:33:34 and also
0:33:36 you are your
0:33:38 your honor
0:33:40 yeah so if something
0:33:44 is a threat
0:33:45 to any of those five things
0:33:48 it's a direct threat to any of those
0:33:49 five things it becomes
0:33:51 necessary
0:33:53 to avert it
0:33:54 and it fulfills the condition of darura
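
As presented here, al-Ghazali's test is simply the conjunction of three conditions. A minimal Python sketch of that gating logic, assuming an illustrative encoding (the function name, the reading of "over 90" as a 0.9 threshold, and the set representation are mine, not al-Ghazali's):

    # Sketch of the three conditions for acting on maslaha as described
    # in the lecture; the representation is illustrative only.
    PROTECTED = {"religion", "life", "intellect", "wealth", "honor"}

    def maslaha_permits(affects_whole_community, certainty, protects):
        kulli = affects_whole_community        # 1. universal in scope
        qati = certainty >= 0.9                # 2. outcome near-certain
        daruri = bool(protects & PROTECTED)    # 3. guards an essential aim
        return kulli and qati and daruri       # all three must hold

    # The boat case discussed below fails: throwing one man off serves
    # two of the three people, not the whole group, so kulli is false.
    print(maslaha_permits(False, 0.99, {"life"}))  # -> False

On this reading the test is deliberately restrictive: failing any one condition blocks the consequentialist move, which is the point the boat example goes on to make.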
0:33:58 so here let me give you some live
0:33:59 examples
0:34:01 and
0:34:02 we have to be careful with these
0:34:03 examples
0:34:04 there is an example,
0:34:06 it's referred to as tatarrus,
0:34:08 and he mentions this
0:34:11 and he says
0:34:12 imagine a situation where there's an
0:34:13 enemy army that's going to overtake your
0:34:15 thing
0:34:17 and you are
0:34:18 forced to use human shields
0:34:21 like from the civilian population
0:34:23 he said is that ever justified
0:34:25 now al-ghazali comes to the conclusion
0:34:26 that it can be justified in very intense
0:34:28 conditions
0:34:30 okay he comes to that conclusion
0:34:32 obviously there are those who disagree
0:34:34 with al ghazali on this point
0:34:36 uh but he says imagine you're in a
0:34:38 situation where you're in the sea
0:34:40 and there's maybe three of you on the
0:34:41 boat yeah
0:34:42 and the ship is
0:34:44 sinking
0:34:46 and in order for you to be saved one of
0:34:48 you have to jump off the boat
0:34:52 no no no so he says
0:34:54 is it the question is it ever
0:34:55 justifiable to throw one guy off the
0:34:57 boat
0:34:58 he says it's never justifiable to do
0:35:00 that
0:35:01 why
0:35:02 because it doesn't fulfill all the
0:35:03 criteria
0:35:06 so it's not in the best interest of all
0:35:08 three of them
0:35:09 it's only in the best interest of two
0:35:10 out of three of them which is 66 percent
0:35:13 for it to be kulli it has to be in the best
0:35:14 interest of 100 percent of them
0:35:16 even though there is discussion about
0:35:17 that some say well two out of three is
0:35:18 the majority blah blah blah but if
0:35:20 there's three of them and he throws one
0:35:22 out it's never uh possible and we
0:35:24 already spoke about by the way
0:35:26 the situation where someone said
0:35:28 rape your own daughter or have
0:35:30 sex with your own daughter or else we will
0:35:32 kill you,
0:35:33 it's never,
0:35:34 no one has ever said that that's
0:35:35 justifiable in islam, you have to
0:35:37 take the bullet basically. we
0:35:38 will never accept 'the rape
0:35:40 of a one-year-old, otherwise we will kill you'
0:35:42 no i'd rather take the bullet because
0:35:44 uh by taking the bullet you never
0:35:46 defile her, because otherwise you're taking two
0:35:49 things
0:35:50 you're you're taking her honor the
0:35:51 one-year-old because she has honor
0:35:53 and you may be taking her life as well
0:35:55 or even if not you're really harming
0:35:57 that one-year-old
0:35:58 uh so these two things are in place
0:36:02 so so we these are the principles of
0:36:04 islam that we do have a sharia
0:36:06 consequentialism but it's very it's much
0:36:08 more complex and sophisticated than
0:36:11 what you have in the
0:36:12 west, where
0:36:13 they don't have these principles
0:36:15 of maslaha,
0:36:17 that it has to be this and it can be
0:36:18 operative in this situation and that
0:36:20 situation, alongside the deontological kind of uh
0:36:22 justification, you know, they don't
0:36:24 have that. it's usually either,
0:36:27 it's usually either/or, and so we have a
0:36:29 combination of both and for us to be
0:36:32 consequentialist there are criterion
0:36:34 that must be met
0:36:36 before such consequentialism is
0:36:37 justified
0:36:40 um yeah we talked about the two examples
0:36:42 now the case study we're going to be
0:36:44 talking about today is the age of aisha
0:36:48 you know this is one of the most common
0:36:49 questions that we find in the muslim
0:36:52 world: 'your prophet married a
0:36:53 nine-year-old'
0:36:56 and the truth of the matter is
0:36:58 historically this is the strongest
0:36:59 opinion that he in fact this was uh
0:37:01 consummated at nine is there any other
0:37:04 opinions yes i mean could you could one
0:37:08 is one of this believer if they believe
0:37:10 that she was 16 or 18 no definitely not
0:37:12 because number one this hadith is mahouf
0:37:15 which is it's a it's a hadith which is
0:37:17 not narrated by the prophet by a sahabi
0:37:20 it's not murphy
0:37:22 and you can have academic discussion so
0:37:24 long it is as it is academic discussion
0:37:26 that's not trying to moralize
0:37:28 but you know you're going to find it
0:37:30 difficult
0:37:30 because it's in bukhari for one and
0:37:33 number two it's actually narrated by
0:37:34 aisha herself and in more than one
0:37:37 narration
0:37:38 so instead of going down the route of
0:37:40 being on the back foot and saying the
0:37:42 narration is weak
0:37:45 the real question is why is it wrong
0:37:48 and why is it wrong
0:37:50 why is it wrong to have intercourse with
0:37:52 someone
0:37:54 of that age for example yeah
0:37:58 why is it wrong to have intercourse with
0:38:00 someone at that age if we believe she's
0:38:01 pubescent etc. is it on
0:38:04 consequentialist grounds or is it on
0:38:06 deontological grounds? now there's only two
0:38:08 options here,
0:38:10 virtue ethics has nothing to do with
0:38:12 this.
0:38:13 so either it's on consequentialist grounds, so
0:38:15 you say it's on consequentialist grounds
0:38:17 so what are the consequences? harm, which we've
0:38:19 just discussed.
0:38:20 is harm a legitimate consequence? yes
0:38:23 but what we're saying is there was no
0:38:25 harm that was done because it was
0:38:27 societally accepted at that time
0:38:29 we're saying that the structures of
0:38:30 society had things in place for that
0:38:33 and you have to show us evidence at
0:38:35 least from the historical record that
0:38:36 there was harm
0:38:37 and if there was harm we would agree
0:38:39 with you
0:38:40 in fact and if there is harm we would
0:38:42 agree with you as well which is why we
0:38:43 don't say that muslims in this society
0:38:46 should go and do that because
0:38:49 the the sorry to say the 12 and 13 year
0:38:52 olds and 14 year olds of today are
0:38:53 nothing like
0:38:55 the 12 and 13 year olds in ancient
0:38:56 athens or sparta or
0:38:58 any any medieval time the structures of
0:39:00 society are different and therefore harm
0:39:02 is different some of them will come with
0:39:04 us and come to us and say well your
0:39:06 prophet married a nine-year-old why
0:39:07 don't you marry a nine-year-old why
0:39:09 would you allow a
0:39:10 nine-year-old of yours to get married? we say no
0:39:12 why not because it would harm her why
0:39:14 would it harm her because the society
0:39:16 does not support what it used to support
0:39:19 where you have 10 year olds in sparta
0:39:20 you were mentioning in the previous
0:39:22 session they knew how to fight better
0:39:23 than men do now in the american army
0:39:26 we don't have that culture no more we're
0:39:28 living in an age of education we're
0:39:29 living in this and that so we say if
0:39:31 it's a consequentialist reasoning and
0:39:33 the consequence is harm therefore we
0:39:35 agree
0:39:36 but we are denying that at that time in
0:39:38 that place that harm was there
0:39:40 and we have our arguments for that what
0:39:42 they say is deontological consequence so
0:39:44 we'll say you have to now the burden of
0:39:45 proves upon the one that's making the
0:39:47 claim
0:39:48 if the if this is meant to be wrong at
0:39:50 all times in all places
0:39:52 with no exception
0:39:54 no consequential exception then that
0:39:57 must be demonstrated
0:39:59 but then you'll fall on your face if
0:40:00 you're trying to demonstrate that
0:40:01 because how,
0:40:03 how can you possibly demonstrate it,
0:40:05 unless you want to use kantian ethics? i think
0:40:07 even then it's difficult
0:40:09 how can you possibly demonstrate such a
0:40:11 thing
0:40:12 so on both it's either consequential or
0:40:14 it's not what principle would um
0:40:19 deontological ethics what principle
0:40:21 would they use, what's the universal
0:40:23 principle that they might just bring
0:40:25 in
0:40:27 once again the the main one that they
0:40:29 could argue is harm
0:40:30 but it's then we're but we're back to
0:40:32 stage one yeah
0:40:33 because the
0:40:34 the whole thing is
0:40:36 you know we are denying that at that
0:40:38 time in that place with that person and
0:40:39 that you know in those circumstances
0:40:41 that there was harm
0:40:42 and that we were saying that the burden
0:40:44 of proof is upon the one
0:40:45 that's making the claim so how can you
0:40:46 prove instead of us being always on the
0:40:48 back foot and this and that say bringing
0:40:50 you know a video of some 10 year old
0:40:53 that has had her menstruation or
0:40:54 something like that you know and she's a
0:40:56 big woman now and there are many of them
0:40:59 by the way this we're not going to say
0:41:00 that in this society we're very clear
0:41:02 you know on this we have to be very
0:41:04 we're not saying in this society that
0:41:05 this applies because the harm we can see
0:41:07 the harm, psychological harm; the society is
0:41:09 different but what we are denying is
0:41:11 that this is a categorical thing that is
0:41:13 wrong in all places all times that's
0:41:15 what we're denying we're denying that
0:41:16 it is contextual
0:41:18 any questions on this point
0:41:20 you know with this particular argument
0:41:23 i've never heard of any ancient source
0:41:26 or any source up until
0:41:28 the the 20th century where it's been you
0:41:32 know described as a bad thing it's a new
0:41:34 phenomenon yeah yeah and we kind of we
0:41:36 kind of went through this if you
0:41:38 remember when we were talking about the
0:41:39 age of consent and when it changed in
0:41:41 the uk it was about less than 100 years
0:41:42 ago and it was connected to education so
0:41:45 we can track the historical record on
0:41:46 when this thing changed and why it
0:41:48 changed in fact it was the first way of
0:41:50 feminism and the uh the the marriage act
0:41:52 and the consent all those kind of things
0:41:54 and they they connected it to education
0:41:55 and we can see it but just because that
0:41:57 is the case it doesn't mean that this
0:41:59 becomes a universal
0:42:01 moral all of a sudden
0:42:03 you know it's a very difficult argument
0:42:05 to make and you're right because i think
0:42:08 the first orientalist to mention this
0:42:09 that we've come across is david
0:42:10 margoliouth for example and this was in
0:42:12 the early 1900s
0:42:14 in the early 1900s where feminism was
0:42:16 becoming a thing and where these
0:42:17 arguments started to become more more
0:42:18 popular
0:42:19 but like we said before this was not a
0:42:22 universally understood thing
0:42:23 and it is context specific we are making
0:42:26 the claim that is context specific
0:42:28 if the claim that is if the claim that
0:42:30 it's not context-specific has to be made
0:42:32 then one has to say that in all cases
0:42:34 all times harm will be there
0:42:36 which we know is impossible that's an
0:42:38 impossible claim
0:42:39 you know uh
0:42:41 that would suggest that in any situation
0:42:42 where any one of that age were to do
0:42:45 that in any time in human history that
0:42:47 they would be harmed, which
0:42:49 could be shown to be demonstrably
0:42:51 false
0:42:52 so what we're saying is that in today's
0:42:54 age yes
0:42:55 so we have to be very careful
0:42:57 not to conflate the ancient medieval
0:42:59 times, and this is why usul is
0:43:01 important
0:43:02 like we're not saying,
0:43:04 the very segmental approach,
0:43:06 you know,
0:43:07 the contextual approach
0:43:09 to hadith, is saying well
0:43:11 no, let's take the hadith and understand,
0:43:13 does this apply today now
0:43:15 there's many hadiths with the slavery
0:43:16 things and this and that, are you going
0:43:17 to apply them? no, you can't apply them
0:43:19 uh you know in south london they were
0:43:21 talking about um if you get a girlfriend
0:43:24 if you say you know get a girlfriend
0:43:25 basically and so she says i'm your slave
0:43:26 now you can sleep with her
0:43:28 have you seen these kinds of fatwas
0:43:30 it's these are the kinds of
0:43:31 ridiculousness that people come out with
0:43:32 because
0:43:33 they don't understand the pragmatic
0:43:35 nature and there is a pragmatic nature
0:43:37 yeah honestly, the pragmatic
0:43:39 nature of islamic law. huh? no it's not valid
0:43:43 it's not valid it's not valid
0:43:46 anyway with that guys we conclude we're
0:43:48 going to continue our discussion uh on
0:43:49 this very topic
0:43:51 and we're going to have one team that's
0:43:52 going to be asking this very question
0:43:54 and one team that's defending themselves
0:43:56 and then we're going to see how it goes
0:43:58 and for those watching at home i hope
0:43:59 you've understood our position and
0:44:01 you've taken heed and
0:44:03 taken notes as you should have been.
0:44:04 as-salamu alaykum