F = AB + C'D'
How can I factor this expression to obtain a product of sums?
I think we have to make up new terms that each equal zero,
like F = AB + C'D' + AA' + B'B or something of the sort, but how do I do it exactly?
Any ideas appreciated.
F = AB + C'D'
  = (A' + B')' + (C + D)'              ; De Morgan on each term
  = ((A' + B')(C + D))'                ; De Morgan again
  = (A'C + A'D + B'C + B'D)'           ; distribute the product
  = (A + C')(A + D')(B + C')(B + D')   ; De Morgan once more: product of sums
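A quick brute-force truth-table check (a sketch in Python, writing x' as 1 - x) that the complemented form ((A'+B')(C+D))', and the product of sums obtained by distributing (A'+B')(C+D) and complementing each term, both equal AB + C'D':

```python
from itertools import product

def NOT(x): return 1 - x  # complement of a 0/1 value

def same(f, g, n):
    """Do f and g agree on all 2**n input combinations?"""
    return all(f(*v) == g(*v) for v in product((0, 1), repeat=n))

F    = lambda a, b, c, d: (a & b) | (NOT(c) & NOT(d))       # AB + C'D'
comp = lambda a, b, c, d: NOT((NOT(a) | NOT(b)) & (c | d))  # ((A'+B')(C+D))'
pos  = lambda a, b, c, d: ((a | NOT(c)) & (a | NOT(d)) &
                           (b | NOT(c)) & (b | NOT(d)))     # (A+C')(A+D')(B+C')(B+D')

print(same(F, comp, 4), same(F, pos, 4))  # True True
```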
Here is a hint:
De Morgan's law:
AB = (A' + B')'
Here is an example
F = ab' + ad + c'd + d'
F' = (ab' + ad + c'd + d')' = (ab')' . (ad)' . (c'd)' . (d')'   ; De Morgan's law: (x.y)' = x' + y'
   = (a' + b)(a' + d')(c + d')(d)
Hence a product of sums (for F').
Your case:
F' = (ab + c'd')' = (ab)'.(c'd')' = (a' + b').(c + d)
Expand F' = a'c + a'd + b'c + b'd and complement again to get F = (a + c')(a + d')(b + c')(b + d').
Just use De Morgan's law ;)
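One point worth verifying by brute force: (a' + b').(c + d) is F' (the complement), not F itself. A minimal Python sketch:

```python
from itertools import product

def NOT(x): return 1 - x  # complement of a 0/1 value

def same(f, g, n):
    """Do f and g agree on all 2**n input combinations?"""
    return all(f(*v) == g(*v) for v in product((0, 1), repeat=n))

F     = lambda a, b, c, d: (a & b) | (NOT(c) & NOT(d))  # ab + c'd'
claim = lambda a, b, c, d: (NOT(a) | NOT(b)) & (c | d)  # (a'+b').(c+d)

print(same(lambda *v: NOT(F(*v)), claim, 4))  # True: it matches F'
print(same(F, claim, 4))                      # False: it is not F
```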
I'm working on some logic homework and I can't figure out the next step in reducing the number of literals. Any help would be greatly appreciated.
(A + B + C) (A’B’ + C)
A’B’C + AC + BC + C
C(A’B’ + A + B + C)
C((A + B)’ + A + B + C)
I'm pretty sure I use the associative law next, but I don't understand how the not operator is distributed when rearranging.
From the point where you left:
C((A + B)’ + A + B + C)
C(1 + C) ; X' + X = 1 applied to X = A + B
C(1) ; 1 + <anything> = 1
C ; <anything>1 = <anything>
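The whole chain can be confirmed by exhaustive enumeration; a quick Python sketch checking (A + B + C)(A'B' + C) against C:

```python
from itertools import product

def NOT(x): return 1 - x  # complement of a 0/1 value

# (A + B + C)(A'B' + C)
expr = lambda a, b, c: (a | b | c) & ((NOT(a) & NOT(b)) | c)

ok = all(expr(a, b, c) == c for a, b, c in product((0, 1), repeat=3))
print(ok)  # True
```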
By applying the conventional 12 rules of simplification, I am unable to simplify this expression!
The objective is to simplify the expression to make it as compact as possible so it can easily be implemented.
A.B'.C + B.C + A.C'
First method to solve it using Boolean algebra:
A.B'.C + B.C + A.C'
C.(B + A.B') + A.C' (Take C as common factor)
C.(B + A)(B+B') + A.C' (Using Distributive Law)
C.(B + A).1 + A.C'
B.C + A.C + A.C'
B.C + A.(C + C') (Take A as common factor )
B.C + A.1
A + B.C
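The algebraic result can be sanity-checked by exhaustive enumeration (a small Python sketch, with x' written as 1 - x):

```python
from itertools import product

def NOT(x): return 1 - x  # complement of a 0/1 value

orig = lambda a, b, c: (a & NOT(b) & c) | (b & c) | (a & NOT(c))  # A.B'.C + B.C + A.C'
simp = lambda a, b, c: a | (b & c)                                # A + B.C

print(all(orig(*v) == simp(*v) for v in product((0, 1), repeat=3)))  # True
```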
The second method is to use a K-map.
Similarly to Abdul's proof:
A.B'.C + B.C + A.C' = (A.B'+ B).C + A.C' (common factor C)
= (A + B).C + A.C' (see below)
= A.C + B.C + A.C' (distribute C)
= A + B.C (A.C + A.C' = A(C + C') = A)
Why A.B' + B = A + B?
A.B' + B = A.B' + A.B + B (because B = A.B + B since B includes A.B)
= A.(B'+ B) + B (common factor A)
= A + B
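The key lemma A.B' + B = A + B can itself be checked in a few lines (sketch):

```python
from itertools import product

lhs = lambda a, b: (a & (1 - b)) | b   # A.B' + B
rhs = lambda a, b: a | b               # A + B

print(all(lhs(a, b) == rhs(a, b) for a, b in product((0, 1), repeat=2)))  # True
```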
{a(b+c)+a’b}’
Using De Morgan's theorem I got a' + b'c'a + b'. Then I factored b' out of b'c'a + b' to get b'(1 + c'a), which just turns into b'. Plugging it back into the equation, I'm left with a' + b'. Is that correct, or do I have this all wrong?
{a(b+c)+(a'b)}' = (a (b+c))' . (a'b)'
= (a' + (b+c)') . (a+b')
= (a' + (b'.c')) . (a+b')
= (a.a') + (a'b') + (ab'c') + (b'c')
= 0 + a'b' + b'c'(a+1)
= a'b' + b'c'(1)
= a'b' + b'c'
= b'(a'+c')
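A brute-force check confirms b'(a' + c'), and also shows that the a' + b' from the question does not match (Python sketch):

```python
from itertools import product

def NOT(x): return 1 - x  # complement of a 0/1 value

expr   = lambda a, b, c: NOT((a & (b | c)) | (NOT(a) & b))  # {a(b+c) + a'b}'
answer = lambda a, b, c: NOT(b) & (NOT(a) | NOT(c))         # b'(a' + c')
guess  = lambda a, b, c: NOT(a) | NOT(b)                    # a' + b'

same = lambda f, g: all(f(*v) == g(*v) for v in product((0, 1), repeat=3))
print(same(expr, answer), same(expr, guess))  # True False
```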
Hello, I am trying to simplify this expression (proving the consensus identity):
(a + b)(b'+ c)(a + c) = (a + b)(b'+ c)
I was thinking of adding (a+b)(b'+ c)(a + c + b' + b), but I don't know what to do after.
Take a look at this:
(a + b)(b' + c)(a + c)
= (ab' + 0 + ac + bc)(a + c)
= (ab' + ab'c +ac + ac + abc + bc)
= (ab' + ab'c + ac + abc + bc)
= (ab'(1+c) + ac + bc(a + 1))
= (ab' + ac + bc)
= (ab' + c(a+b))
= (ab' + bb' + c (a+b))
= (a+b)(b' + c)
The key step is realising that bb' = 0 so you can safely add that term without affecting the result in the penultimate step.
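The identity can also be confirmed exhaustively; a quick Python sketch:

```python
from itertools import product

def NOT(x): return 1 - x  # complement of a 0/1 value

lhs = lambda a, b, c: (a | b) & (NOT(b) | c) & (a | c)  # (a+b)(b'+c)(a+c)
rhs = lambda a, b, c: (a | b) & (NOT(b) | c)            # (a+b)(b'+c)

print(all(lhs(*v) == rhs(*v) for v in product((0, 1), repeat=3)))  # True
```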
(A+C')(B'+C')
AB' + AC' + B'C' + C'
AB' + (A + B' + 1)C'
We know that 1 + anything = 1.
So the required expression is AB' + C'.
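A quick enumeration check that (A + C')(B' + C') really equals AB' + C' (sketch):

```python
from itertools import product

def NOT(x): return 1 - x  # complement of a 0/1 value

lhs = lambda a, b, c: (a | NOT(c)) & (NOT(b) | NOT(c))  # (A+C')(B'+C')
rhs = lambda a, b, c: (a & NOT(b)) | NOT(c)             # AB' + C'

print(all(lhs(*v) == rhs(*v) for v in product((0, 1), repeat=3)))  # True
```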
I'm taking a class on digital logic and I am having a hard time with boolean algebra and simplifying logic functions with it. I have tried answering this problem several times and I keep coming to the answer "1", which I feel is absolutely wrong.
The question is
Consider the logic function f(a,b,c) = abc + ab'c + a'bc + a'b'c + ab'c'. Simplify f using Boolean algebra as much as possible.
I have tried solving it several ways using the boolean identities given in my textbook and from lecture, but I keep coming to something like c + 1 which is equivalent to 1, which I don't feel is the correct answer considering the next question in the problem.
Here is my last attempt:
f(a,b,c) = abc + ab'c + a'bc + a'b'c + ab'c'
= a(bc + b'c + b'c') + a'(bc + b'c) # Distributive, took out the a and the a' separately.
= (a + a')((bc + b'c + b'c') + (bc + b'c)) # Distributive(?), took out the a and a' together (This is probably where I screwed up).
= (1)((c + b'c') + c) # a + a' = 1; bc + b'c = c (Combining).
= c + b'c' + c # cleaned up a little.
= c + b'c' # c + c = c.
= c + (b' + c') # b'c' = b' + c' (DeMorgan's Theorem).
= 1 + b' # c + c' = 1.
= 1 # 1 + b' = 1
This feels absolutely wrong to me, and the next question asks me to make the logic circuit for it, which I don't think is possible.
Can anyone help/walk me through what I am doing wrong? I would really appreciate it. :(
(P.S. I used code formatting, I apologize if this is annoying to some.)
By this table:

A B C | Y
1 1 1 | 1
1 1 0 | 0
1 0 1 | 1
1 0 0 | 1
0 1 1 | 1
0 1 0 | 0
0 0 1 | 1
0 0 0 | 0

Y = ab' + c
I've got it :D
f(a,b,c) = abc + ab'c + a'bc + a'b'c + ab'c'
= a(bc + b'c + b'c') + a'(bc + b'c)
= a(c(b + b') + b'c') + a'(c(b + b'))
= a(c * 1 + b'c') + a'(c * 1)
= a(c + b'c') + a'c
= a(c'(b'c')')' + a'c
= a(c'(b + c))' + a'c
= a(c'b +cc')' + a'c
= a(c'b)' + a'c
= a(c+b') + a'c
= ac + ab' + a'c
= c(a + a') + ab'
= ab' + c
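The result matches an exhaustive check of the original five-minterm function (Python sketch):

```python
from itertools import product

def NOT(x): return 1 - x  # complement of a 0/1 value

# abc + ab'c + a'bc + a'b'c + ab'c'
f = lambda a, b, c: ((a & b & c) | (a & NOT(b) & c) | (NOT(a) & b & c) |
                     (NOT(a) & NOT(b) & c) | (a & NOT(b) & NOT(c)))
simplified = lambda a, b, c: (a & NOT(b)) | c  # ab' + c

print(all(f(*v) == simplified(*v) for v in product((0, 1), repeat=3)))  # True
```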