1 Algorithm Definition
- Finite set of available instructions
- Made of finite instructions
- Discrete steps taking finite time
- Each step depends only on the previous ones
- No limit to steps or space during execution
2 Turing Machine
A Turing Machine is
- Set of states, halting state
- Set of symbols
- Initial state
- The transition function: if , then
is the free monoid generated from . is the associative string concatenation operator. is the empty string and identity element.
2.1 Configuration
A configuration of a TM is a quadruple: where .
is the currently selected symbol under the head of the machine. Short notation is or when position is not needed.
2.2 Computations
A computation is a finite succession of computation steps:
- Computation step
- Reflexive and transitive closure of computation steps
- Machine computes in steps
- Converges and diverges
2.3 T(uring)-Computability
Let be alphabets, let and .
Let be a function .
is Turing-computable by a machine
Problems and Functions
2.4 Poly reduction
Then there exists for some k
2.5 Total Function
, subset of , is total:
- is defined everywhere
- uniqueness
2.6 Partial Function
, subset of is a partial function
|
uniqueness
|
at least a
|
not required to be defined everywhere
|
Theorem
Computable functions are . Total computable functions are . There exist functions that are not computable.
Proof
▼
Computable functions can be given a Gödel numbering. Therefore there exists a 1-1 mapping with the natural numbers. All the possible total functions are uncountably many.
Theorem
Padding Lemma: every computable function has infinitely many indexes: an infinite set of indexes such that
Proof
▼
If a program in WHILE computes , one can build other programs that compute the same thing, semantically equal to , by just adding a skip command at the end of the program. The new program's index will change, because an effective numbering depends only on the syntactic description of the program.
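A toy illustration of this argument in Python rather than WHILE (the helpers `pad`, `index`, and `run` are made up for the sketch): padding changes the syntactic description, hence the index, but not the computed function.

```python
# Padding Lemma sketch: programs are plain source strings, the "index"
# is any injective function of the syntax (here the bytes of the text),
# and padding appends a semantically inert no-op.
def pad(program_src: str) -> str:
    """Append a no-op statement: same semantics, different syntax."""
    return program_src + "\npass"

def index(program_src: str) -> int:
    """An effective numbering depends only on the syntactic description."""
    return int.from_bytes(program_src.encode(), "big")

def run(program_src: str, x: int) -> int:
    """Toy evaluator: the program must define f(x)."""
    env = {}
    exec(program_src, env)
    return env["f"](x)

p = "def f(x):\n    return x * x"
padded = pad(p)
assert index(p) != index(padded)          # a fresh index...
assert run(p, 7) == run(padded, 7) == 49  # ...for the same function
```

Iterating `pad` produces the infinite set of indexes the lemma promises.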
Theorem
The class of Primitive Recursive functions (and extensions) does not contain all the effectively computable functions:
Proof
▼
Reductio ad absurdum:
1. Encode and enumerate the p.r. functions. There are countably many.
2. Define the diagonal function: intuitively computable.
3. The diagonal function doesn't appear in the list.
This implies there is no formalism expressing only and all the total functions.
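The diagonal step, as a runnable toy (a finite stand-in list replaces the effective enumeration of all p.r. functions):

```python
# Diagonalization sketch: given any list f_0, f_1, ... of total
# functions, d(i) = f_i(i) + 1 is total and computable from the list,
# yet differs from every f_i at input i, so it cannot be in the list.
fs = [lambda n: 0, lambda n: n, lambda n: n * n]  # stand-in "enumeration"

def d(i: int) -> int:
    return fs[i](i) + 1

for i in range(len(fs)):
    assert d(i) != fs[i](i)  # d disagrees with f_i on input i
```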
Recursion Axiom
Let be a p.r. function in variables, a p.r. function in variables. The function in variables defined as follows is then p.r.
This represents a FOR loop. , the first variable of , is the decrementing counter. is the final step. represents the loop steps. This is guaranteed to terminate. One can define basic arithmetic primitives with p.r. functions.
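A sketch of the recursion axiom as a Python combinator (the name `prec` is illustrative): h(0, xs) = g(xs) and h(n+1, xs) = f(n, h(n, xs), xs), realized as the FOR loop described above, which is guaranteed to terminate because the counter strictly decreases.

```python
# Primitive recursion as a FOR loop.
def prec(g, f):
    def h(n, *xs):
        acc = g(*xs)          # base case: the final step g
        for k in range(n):    # exactly n loop iterations
            acc = f(k, acc, *xs)
        return acc
    return h

# Addition as a p.r. function: add(n, m) = m + n
add = prec(lambda m: m, lambda k, acc, m: acc + 1)
assert add(3, 4) == 7

# Multiplication built on top of addition: mul(n, m) = m * n
mul = prec(lambda m: 0, lambda k, acc, m: add(m, acc))
assert mul(3, 4) == 12
```

Stacking definitions this way (successor, addition, multiplication, exponentiation, ...) is exactly how the basic arithmetic primitives are obtained.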
Theorem
Enumeration:
Proof
▼
Guarantees that a formalism expressing all computable functions has a universal algorithm. Therefore there exists a Universal Turing Machine.
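The idea behind the Universal Turing Machine, as a minimal sketch: the machine description is ordinary data (a transition table), and one fixed interpreter runs any such description.

```python
# Minimal universal simulator: delta maps (state, symbol) to
# (new state, symbol to write, head move). Blank is '_'.
def run_tm(delta, q0, halt, tape, steps=10_000):
    tape = dict(enumerate(tape))  # sparse tape
    q, pos = q0, 0
    for _ in range(steps):
        if q == halt:
            return q, tape
        sym = tape.get(pos, "_")
        q, write, move = delta[(q, sym)]
        tape[pos] = write
        pos += {"R": 1, "L": -1}[move]
    raise RuntimeError("step budget exceeded")

# Example description: a machine that flips every bit, halting on blank.
flip = {
    ("s", "0"): ("s", "1", "R"),
    ("s", "1"): ("s", "0", "R"),
    ("s", "_"): ("h", "_", "R"),
}
q, tape = run_tm(flip, "s", "h", "0110")
out = "".join(tape[i] for i in range(4))
assert out == "1001"
```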
3 Turing Machines
Deterministic TMs are seen in the Computability Theory Cheat Sheet.
3.1 K-tapes Turing Machines
Let be the number of tapes. A k-tape TM is a quadruple . Special symbols , while . Since we are only considering decision problems, are the halting states. The transition function differs but is subject to the same restrictions as a classical TM.
3.2 IO Turing Machines
They are useful to study space complexity. Any k-tape TM of IO type is subject to the following constraints. Consider :
- First tape is read-only ( or )
- Last tape is write-only
- Input tape is right-bounded
3.3 Nondeterministic Turing Machines
A nondeterministic TM is a quadruple where are the same as in other Turing machines. They can be of IO and k-tape type. The difference lies in the fact that
is not a transition function but a transition relation. The same syntax as for other TMs is used for nondeterministic TM configurations and computations. A nondeterministic TM decides if and only if there exists at least one computation such that
The degree of nondeterminism of a NDTM can now be defined as , where
4 Reductions
A (decision) problem (a set) reduces to with reduction , in symbols when . We also have that because .
4.1 Reduction Relations
A relation of reductions () can be defined on a class of functions as follows, note that is also a partial pre-order.
4.2 Properties
Let and be two classes of problems such that , and . A reduction relation ≤_F classifies and (problems):
1. Reflexivity
2. Transitivity
3. closed by reduction
4. closed by reduction
Equivalently:
1. Identity
2. Closed by composition
3.
4.
4.3 Hard and Complete Problems
If classifies and , then problems .
-
- is -hard for if
- is -complete for if
Theorem
If classifies and , and is complete for then
Proof
▼
is obvious. : Let and . By completeness, and , then which implies
Theorem
Completeness is "transitive" for reductions: If is complete for , and then is complete for .
Boolean Circuit
Definition
A Boolean circuit C “computes” a (finite) function over De Morgan basis
A circuit ensemble is an infinite family of circuits . has n input bits, has m output bits, if each has output bits (i.e., m is a function from N to N, and might be constant, e.g., m(n) = 1). Circuit ensembles can handle any input length, i.e., compute any (even infinite) function, but the description size is infinite!
Theorem
(Shannon) Every can be decided by a (De Morgan) circuit of size
Regular language
Lemma
(Pumping lemma): for every regular language L, there exists p ≥ 1 (a.k.a. pumping length) s.t. every w ∈ L with |w| ≥ p can be written as w = xyz with |xy| ≤ p and |y| ≥ 1, such that xy^i z ∈ L for every i ≥ 0.
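A concrete check of the lemma on the regular language L = (ab)*, with pumping length p = 2 (the helper `in_L` and the split x, y, z are illustrative choices):

```python
import re

# L = (ab)*: membership test via a regular expression.
def in_L(w: str) -> bool:
    return re.fullmatch("(ab)*", w) is not None

w = "ababab"
assert in_L(w) and len(w) >= 2
# Decomposition w = xyz with |xy| <= p and |y| >= 1:
x, y, z = "", w[:2], w[2:]
assert len(x + y) <= 2 and len(y) >= 1
for i in range(5):
    assert in_L(x + y * i + z)  # pumped strings remain in L
```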
5 Classical Results
Resulting regular expression:
is regular if regular: T. Build NFA ,
is regular if regular: F. Take r = r1* r2 (r4 + r3 r1* r2)*, where:
- r1: initial → initial
- r2: initial → accepting
- r3: accepting → initial
- r4: accepting → accepting
6 Complexity Classes
6.1 Hierarchy
Since , at least one of those inclusions is strict:
It is not yet known which one of those inclusions is strict.
Theorem. . Proof. Let . There is a TM that computes in space . can range through configurations. A halting computation cannot cycle on configurations. So M computes in for some .
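The configuration-counting step of this proof can be made concrete: a machine with |Q| states, tape alphabet Γ, and f(n) usable cells has at most |Q| · f(n) · |Γ|^f(n) ≤ c^f(n) configurations, and a halting computation never repeats one.

```python
# Upper bound on the number of configurations of an f(n)-space machine:
# (state) x (head position) x (tape contents).
def config_bound(num_states: int, alphabet_size: int, space: int) -> int:
    return num_states * space * alphabet_size ** space

# Example: 5 states, alphabet {0, 1, blank}, 10 cells of space.
bound = config_bound(5, 3, 10)
assert bound == 5 * 10 * 3 ** 10  # exponential in the space bound
```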
Theorem. (Savitch) .
Theorem. Hierarchy of time and space. If is appropriate, then , and . Corollary.
Proof. is obvious because grows faster than any polynomial. It is strict because . This corollary, together with the fact that lets us conclude that .
Theorem. Hierarchy 2. Let be an appropriate measuring function, and a constant. Then
Resulting regular expression:
(Example) is regular. Consider some DFA for . For every , let be the set of states such that there is *some* word of length which leads the DFA from the initial state to . Let be the set of states such that there is *some* word of length which leads the DFA from to an accepting state. Finally, for any two states , let be the (regular) set of words leading the DFA from to . We have
Since there are only finitely many possibilities for , the union is in fact finite, and so regular
7 Theorems on Complexity of Turing Machines
Theorem
Reduction of tapes: Let be a -tape TM deciding in deterministic time ; then there exists a 1-tape TM that decides in time
Proof
▼
(Only a draft): Build a 1-tape TM by introducing two symbols and to denote the start and end of the -th tape. Introduce new symbols to remember each tape's head location. To generate the initial configuration , states and steps are needed. To simulate a move of , scans the tape from left to right and back twice: the first pass finds the marked symbols, the second writes the new symbols. If a tape has to be extended, the end markers have to be shifted and a cascade happens. This takes time for each move of . Since takes time to compute an answer, will take
Corollary
Parallel machines are polynomially faster than sequential machines.
Theorem. Linear speedup: If , then . Proof. (Draft): Build from , solving , compacting symbols of into 1 of , as a function of . The states of will be triples , meaning the TM is in state and has the cursor on the -th symbol of . then needs 6 steps to simulate steps of . Therefore, will take . Then . □
The same principle applies with linear space compression. If , then
Theorem
For each k-tape TM that decides I in deterministic time there exists an IO TM with k+2 tapes that computes I in time for some constant .
Proof
▼
copies the first tape to the second tape, in steps. Then operates identically to . If and when halts, copies the result to tape , in at most steps. 's computation was steps long.
Theorem
Exponential loss in determinization, or brute force: If and is computed by a -tape , it can also be computed by a deterministic TM with tapes in time , with an exponential loss of performance. In short,
Proof
▼
Let be the degree of nondeterminism of . Sort the choices lexicographically. Every computation of length carried out by is a sequence of choices that can be seen as a sequence of natural numbers with . The deterministic TM considers these sequences in increasing order, visiting the tree of nondeterministic computations one at a time. Therefore ; also, all the possible simulations can be .
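A minimal sketch of this brute-force search (the states and the `step` relation are toy stand-ins): with degree of nondeterminism r, every length-t computation is a sequence of choices in {0..r-1}, and the deterministic simulator enumerates all of them, at cost r^t.

```python
from itertools import product

# Enumerate choice sequences in increasing length (lexicographic within
# a length), i.e. walk the tree of nondeterministic computations.
def nd_accepts(step, start, accepting, r, max_t):
    for t in range(max_t + 1):
        for choices in product(range(r), repeat=t):  # r^t sequences
            q = start
            for c in choices:
                q = step(q, c)
                if q is None:       # this branch dies
                    break
            else:
                if q in accepting:  # some branch accepts
                    return True
    return False

# Toy relation: from state 0, choice 1 leads to the accepting state 1.
step = lambda q, c: 1 if (q, c) == (0, 1) else None
assert nd_accepts(step, 0, {1}, r=2, max_t=3)
assert not nd_accepts(step, 0, {2}, r=2, max_t=3)
```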
Space complexity
STCON (PATH) is NL-complete under logspace reductions.
NL closure: If the language is -complete then
1. 2. Each problem in is logspace reducible to
Since each problem in is logspace reducible to as well. Thus is -complete.
Any has a log-space verifier. A log-space verifier for is a deterministic TM with 3 tapes (input, certificate, and work tape) that runs in polynomial time and uses cells of the work tape. If , there exists a certificate s.t. the verifier accepts; if , it rejects for all certificates. There exists such a verifier that reads its certificate tape once, moving right only.
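STCON itself (is there a directed path from s to t?) is trivially in P via BFS; the point of NL-completeness is that it can also be decided nondeterministically in O(log n) space by guessing the path one vertex at a time. The plain BFS version, as a sketch:

```python
from collections import deque

# Decide STCON by breadth-first search over the directed edge list.
def stcon(edges, s, t):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
    seen, queue = {s}, deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            return True
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return False

assert stcon([(1, 2), (2, 3)], 1, 3)
assert not stcon([(1, 2), (3, 1)], 1, 3)
```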
Composition of space-bounded functions
is a bit tricky: Thm: if and are computable in space respectively, then is computable in space (with being a bound on the output length of ).
Randomized
Definition
The class RP consists of all languages L that have a polynomial-time randomized algorithm A with the following behavior:
if x ∈ L then A rejects with probability at most 1/2, and if x ∉ L then A always rejects.
Definition
The class BPP consists of all languages L that have a polynomial-time randomized algorithm A with the following behavior:
if x ∈ L then A accepts with probability at least 2/3, and if x ∉ L then A accepts with probability at most 1/3.
Theorem
(Random algorithm) Given n-digit strings and , then
Proof
▼
Suppose . Let when . Then , then
Theorem
(Adleman's Theorem) Every language in BPP can be decided by a family of poly-size circuits.
Consequences
If then you have a collapse to the first level, namely . Assume , since is closed under complement we have , thus .
Let be a language in , then there exists a probabilistic polynomial time Turing machine with access to a oracle , call it , such that:
, and
.
Where is the running time of (in the standard definition, the second inequality appears with , however it can be improved to and thus to the above).
Suppose that the Turing machine which decides has expected runtime . Now, let us look at the machine , which replaces each oracle call of by a steps simulation of the machine for , and repeats this simulation times. If we come upon an oracle call, for which this process did not result in an answer (none of the -steps simulations halted), then rejects.
Let denote the event that all of the simulations halted in the given time, then:
.
Thus, we have:
, and
.
By Markov's inequality, a single simulation does not halt with probability , so -step simulations do not output an answer with probability . By the union bound we have , for and . In this case achieves the following separation:
, and
Which proves , since we can replace the constant with any function .
change probability in definition of PP
Let's say a separation using exists for if there exists a polynomial probabilistic Turing machine such that:
iff there exists a separation for using (we can use a strict inequality in both cases). We proceed to show that, if there exists a separation for using , then there exists a separation using . Now, given a machine with separation for , let's consider a machine , which starts by tossing an -biased coin. If the outcome is then it accepts; otherwise it simulates on the input and returns its answer.
Since we want to be polynomial in the worst case, we restrict the simulation of the -biased coin to a certain number of iterations, such that the simulation halts with probability (this can be done for any , simply use the fact that the simulation runs in expected constant time, and apply Markov’s inequality). If the simulation didn’t halt, accepts. In this case we have:
M_q = ε + (1-ε)(α + (1-α)M_p), so:
M_q > ε + (1-ε)(α + (1-α)p)
M_q < ε + (1-ε)(α + (1-α)p) This means it is enough to require . Equivalently, .
Since , we can find a small rational and a rational to satisfy this. If , you can change to reject when the outcome of the -biased coin is 1. When this equality holds, achieves separation for and we are done.
CircuitSat DTIME consequences
does not imply , since the reduction might increase the input's size. Suppose for example that and the reduction from to maps strings of length to strings of length (for all ); then applying the reduction and using a naive exponential-time algorithm for CircuitSAT yields a -time algorithm for .
If however you can place an NP-complete problem in a subexponential time class, e.g. , then , since is closed under polynomial transformations, i.e. .
P/poly SIZE consequences
If NEXPTIME then NEXPTIME = EXPTIME. Proof that : Let L be a language in BPP, and let be a polynomial-time algorithm that decides with error (where x is the input string and r is a set of random bits).
Construct a new machine , which runs times and takes the majority vote of the results (where is the input length and R is a sequence of independently random ). Thus is also polynomial-time, and has an error probability by the Chernoff bound. If we can fix , then we obtain an algorithm that is deterministic.
If is defined as , we have:
The input size is n, so there are 2^n possible inputs. Thus, by the union bound, the probability that a random R is bad for at least one input x is
In words, the probability that R is bad for some x is less than 1, therefore there must be an R that is good for all x. Take such an R to be the advice string in our P/poly algorithm.
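The majority-vote error used in the amplification step can be computed exactly as a binomial tail (a 1/3 per-run error is assumed here for illustration; the Chernoff bound only says it decays exponentially):

```python
from math import comb

# Probability that more than half of t independent runs are wrong,
# when each run errs independently with probability p_err.
def majority_error(t: int, p_err: float) -> float:
    return sum(comb(t, k) * p_err**k * (1 - p_err) ** (t - k)
               for k in range(t // 2 + 1, t + 1))

single = 1 / 3
# More repetitions means strictly smaller error than a single run:
assert majority_error(11, single) < majority_error(5, single) < single
assert majority_error(101, single) < 1e-2
```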
Prove that PP is closed under complement
algorithm A with the property that and .
Let be the polynomial upper bound on the running time of A on input x. Thus A makes at most random coin flips during its execution. In particular, .
I understand that:
- By definition of PP, . It means that if indeed , more than half of the possible coin flip combinations would cause the algorithm to accept.
- The algorithm's running time is bounded by , so the probability of getting some specific combination of coin tosses (that accepts or rejects, we do not know) is bounded by .
But why is it so obvious that
This justifies the assumption (since A is still a polynomial-time probabilistic algorithm) and completes the proof.
Theorem
(extended rice) Let s.t then
Proof
▼
and let be a TM that accepts some . Define ,
On input : Emulate and in parallel, and accept if one of them does.
Extended Rice (not the version we saw)
Theorem: Given a property , the language
is recursively-enumerable () if and only if all the following three statements jointly hold
1. For any two , if and also then also .
2. If then there exists a finite subset so that .
3. The set of all *finite* languages in is enumerable (in other words: there is a TM that enumerates all the finite languages ).
Proof
▼
If (1,2) hold, but (3) does not, then *. Let's assume that , and we'll see that we have a way to accept any finite language in (and thus the set of all these languages is RE), so condition (3) holds and we reach a contradiction. How to decide if a finite belongs to or not? Easily: we use the description of to construct a machine that accepts only the words in , and now we run the machine of on (remember, we assumed , so there is a machine that accepts !). If then , and since , its machine will say yes on the input , and we are done.
If (2,3) hold, but (1) does not, then *. We assume that and we’ll show that we have a way to decide , leading to a contradiction.
Because condition (1) doesn’t hold, there is a language and a superset of it, so that . Now we are going to repeat the argument used in Section 4 to decide : given an input for , we construct a machine whose language is if or otherwise, its language is . Then, we can decide : either halts on , or the RE-machine for accepts ; we can run both in parallel and are guaranteed that at least one will halt.
Let’s give the details of constructing (on input ):
1. Do the following in parallel: 1.1 run on . 1.2 run the machine of on 2. If 1.2 halts and accepts - accept. 3. If 1.1 halts: run the machine of on .
Why does this work? If then 1.1 never halts, and accepts exactly all the inputs that are being accepted at step 1.2, so . On the other hand, if then, at some point step 1.1 halts and accepts exactly . It may happen that accepts beforehand, but since , this doesn’t change the language of in this case.
If (1,3) hold, but (2) does not, then *. Again, we will assume and show that becomes decidable, which is a contradiction.
If condition (2) doesn’t hold, then for any , all its finite subsets satisfy (note that must be infinite, since ). As in the above, in order to decide for a given input , we construct a machine whose language is if and some finite otherwise. The contradiction follows in a similar way as above.
The construction of this machine is quite similar to the previous we constructed. The machine (on input ) does:
1. Runs on for steps. 2. If halts during step 1 – reject 3. Otherwise, run the machine of on .
It holds that, if , then at some point, say after 1000 steps, halts on . Therefore, step 1 will halt on (and reject) any input of length . Therefore, in this case, is **finite**. Also note that , and in particular, by our assumptions on the invalidity of condition (2), we have that .
On the other hand, if , then step 1 never halts, and we never reject at step 2. In this case it is easy to see that and in particular, .
We are left to show the other direction of the extended theorem. That is, we need to show that if all the conditions (1,2,3) hold, then we have a TM that accepts , that is, . In other words, we need to show a machine so that for any input for which , the machine accepts this input, .
Here is how the machine behaves (on input ):
1. Let be the machine that enumerates all the finite languages in , guaranteed by condition (3). 2. Run the following in parallel for 2.1 Run until it outputs the language 2.2. Check if accepts all the words of (run on these words, again in parallel). 2.3. If for some , accepts all the words of – accept.
Why does it work? If then it has a finite subset , and once outputs that subset, step 2.2/2.3 will find that accepts all the words in that language and accept.
On the other hand, if it cannot be accepting all the words in for any . Indeed, by condition (1), any is also in , so if accepts all the words in for some , then and thus , in contradiction.
Given a non-trivial property , so that , the language
is not recursively-enumerable, that is, .
Prove/ Disprove
claim
For language L s.t L satisfies pumping lemma with pumping constant ,
satisfies pumping lemma with pumping constant
Proposition 1: is not closed under complement.
Proposition 2: is closed under intersection.
Proposition 3: is closed under Kleene star.
Proposition 4: is closed under Prefix, but is not closed under Prefix.
Prove: If and then
Disprove: If and then
Disprove: If then
Disprove: For every , then or
Prove: If is regular, then
For every nontrivial TRUE
For every nontrivial Equivalent to an Open Problem
There exists a language in that is complete w.r.t polynomial-time reductions. TRUE.
If there exists a deterministic TM that decides SAT in time
Then every is decidable by a deterministic TM in time . TRUE.
8 Determine if in
Input: Sets , and a number .
Question: do there exist mutually disjoint sets not in
Input: Sets .
Question: do there exist 3 mutually disjoint sets in
Input: Sets , and a number .
Question: do there exist sets such that such that not in
Input: a CNF formula with at most 50 variables.
Question: does there exist an assignment that satisfies in
Input: a CNF formula with at most 50 clauses.
Question: does there exist an assignment that satisfies in
Input: a 3CNF formula with even number of variables.
Question: does there exist an assignment that satisfies and gives for exactly one half of the variables? not in
Input: undirected graph , a number .
Question: does there exist in a clique of size at least or an independent set of size at least ? not in
8.1 Examples
is a TM and
are TMs and
are TMs and
is Regular
is not regular
is not regular
is not regular
is a TM and there exists an input on which M halts in less than steps
is a TM and : co-RE. Reduce . Idea: copies M and x to its tape and runs M on x; it accepts if M halts on x.
M is a TM and : RE. Reduce . It erases w, puts M and x on its tape, runs M on x, and accepts if M halts on x.
is a TM that accepts all even numbers not RE
is a TM and is finite not RE
is a TM and is infinite not RE
is a TM and is countable: R. This is the language of all TMs, since there are no uncountable languages (finite alphabets and finite-length strings).
is a TM and is uncountable: R. This is the empty set.
are TM’s and RE. reduce HALT s.t halts on accepts all strings, and in particular it accepts .
are TM’s and RE.
are TM’s and not RE.
s.t
is a TM. and halt on all input, and RE using Rice
is a TM. and halt on all input, and R
is a TM. and is str and there exists a TM, , such that R
is a TM. , and there exists an input whose length is less than 100, on which M halts RE
is a TM, and at least TM’s of halt on x RE its reduce
is a TM and M halts on all palindromes: not RE. on input x works as follows: run M on w for steps; if M halts on w within steps, reject, otherwise accept (if is a palindrome).
is a TM and is empty.not RE on input x: erase x, and run M on w. If M halts on w then accepts.
is a TM that halts on all inputs and for some undecidable language .R
is a TM and M accepts (at least) two strings of different lengths .RE
M is a TM such that both and are infinite: not RE. M' on input x works as follows: if x is of odd length, accept; if x is of even length, run M on w, and if M halts, accept.
does not halt on accept all odd length str and reject all even. contain even-length strings,
is a TM, and is prime: not RE. on input x: if x is one of the first three strings (in lexicographic order) of , then accept. Otherwise, run M on w. If M halts on w, and x is the 4th string in , accept; otherwise, reject.
and exists s.t for every not RE on input x: it runs M on w; if M halts on w, accepts.
and exists s.t for every Recursive (the language of all Turing machines)
is TM and exists s.t and Recursive(the language of all Turing machines)
and not RE s.t. . on input : run M on ; if M halts, check whether y is of the form where is a TM and z is a string; if so, run on z; if halts, accept. Hence (1) if (2) otherwise then
M does not accept any string w such that 001 is a prefix of not RE
M does not accept any string w s x is a prefix of not RE
x is a prefix of : Recursive
not RE
not RE M1 on x: runs M on w; if M halts, M1 accepts.M2 on y: reject. M3 on z: reject.
there exist two TM’s M2 and M3 such that Recursive
M is a TM that accepts w using at most squares of its tape: Recursive. Let m be the number of states in M, and k the size of the alphabet that M uses; if M uses at most squares of its tape, then there are at most configurations. If M runs on w for more than steps and does not use more than squares of its tape, then M must be in some configuration at least twice.
runs M on w for at most a + 1 steps. If M accepts w within a steps using at most squares, halt and accept.
Which one is false for sure
Which one we don't know whether it is closed under Kleene star
Let be computable with circuit size ; we can construct a circuit of size with only AND, NOT gates for sure.
(maybe)
is (always true) complete
If we find that then
G is a directed graph, and the shortest path between s,t |V|/2 } NL
G is a directed graph, and there exists a path between s,t |V|/2 } NPC
If then every clause of ϕ} is NPC
is NPC
is regular if Recursive
is regular regular if Recursive.
stands for completeness with respect to logspace reduction
CVAL = P-complete
STCON = NL-complete
UHAMPATH = NP-complete
UHAMCYCLE = NP-complete
SUBSET-SUM = NP-complete
2-NFA with same L
To compare the languages accepted by both, we have to figure out whether L(A) is equal to L(B) or not. Thus, we have to find out whether L(A)-L(B) and L(B)-L(A) are the null set or not. (Reason 1)
Part 1: To find this out, we construct an NFA X from NFA A and NFA B, . If L(X) is the empty set then L(A) = L(B), else L(A) != L(B). (Reason 2)
Part 2: Now we have to find an efficient way of proving or disproving that X is the empty set. When will X be empty, as a DFA or NFA? Answer: X will be empty when there is no path leading from the starting state to any of the final states of X. We can use BFS or DFS for this.
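Parts 1 and 2 as a sketch for DFAs (the helper `equivalent` is an illustrative name): explore the product automaton and report inequality as soon as a reachable state pair is accepting in exactly one of the two machines, which is exactly testing emptiness of the symmetric difference.

```python
from collections import deque

# dA, dB: transition tables mapping (state, symbol) -> state.
def equivalent(dA, dB, startA, startB, accA, accB, alphabet):
    start = (startA, startB)
    seen, queue = {start}, deque([start])
    while queue:
        a, b = queue.popleft()
        if (a in accA) != (b in accB):  # word accepted by exactly one
            return False
        for c in alphabet:
            nxt = (dA[(a, c)], dB[(b, c)])
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

# Two DFAs over {a} counting parity of the length.
dA = {(0, "a"): 1, (1, "a"): 0}
dB = {(0, "a"): 1, (1, "a"): 0}
assert equivalent(dA, dB, 0, 0, {0}, {0}, "a")      # same language
assert not equivalent(dA, dB, 0, 0, {0}, {1}, "a")  # opposite parity
```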
We prove by contradiction that L is undecidable. Assume L were decidable, and let be a TM that decides it. We construct a TM M′ that decides HP (the halting problem), i.e., we show HP is decidable as well, which is a contradiction. The TM M′ on input ⟨M, w⟩:
i. builds a TM M″ from M, where M″ shifts w one tape cell to the right and marks the leftmost tape cell with ].
ii. M″ runs M on w.
iii. If M″ encounters the ] symbol during the run of M on w, then M″ moves to the right and simulates M reaching the leftmost tape cell.
iv. If M halts and accepts, M″ simulates moving its head all the way to the left and off the leftmost tape cell.
The TM M′ runs on the input ⟨M″, w⟩. If accepts, M′ accepts; otherwise, M′ rejects. Notice that always halts (accepts or rejects) since we assumed it is a decider. You can prove now that M″ moves its head off the leftmost tape cell iff M halts on (and accepts) w. It then follows that M′ decides HP, and hence HP is decidable, which is a contradiction. Therefore, the language L is not decidable.
The trick to proving this language is decidable is to notice that M moves its head left on an input w if and only if it does so within the first steps of its run on w, where is the number of states of the machine M. Prove it! Therefore, to decide whether an input ⟨M, w⟩ is an instance of L, the decider simply runs M on w for at most steps and accepts iff M moves its head left (the correctness of this construction follows immediately after you prove the validity of the above trick).
is a bound on the number of k-cell configurations of M. Any space-bounded Turing machine also runs in time . Is this because a configuration consists of a state, a position of the head, and the contents of the work tape, which is ? If there are possible configurations, then any such Turing machine either runs in time at most , or it loops forever. That's because if it runs for at least time steps, some configuration must be repeated; and if a configuration is repeated, it will continue repeating forever.
Let , i.e., . Then . For similar reasons, . So, the number of possible configurations of a -space-bounded Turing machine is , and thus any such machine must either halt in time at most or loop forever.
is undecidable.
We reduce from the halting problem. Suppose that we are given a machine , and we are to decide whether halts on the empty input. We construct a new machine accepting a single input , which operates as follows:
1. Let .
2. runs for steps.
3. If halted within steps, then runs a dummy loop taking exponential time . Otherwise, just halts.
Since Turing machines can be simulated with only polynomial overhead, if doesn’t halt then runs in polynomial time. If halts, then takes exponential time. Hence halts iff is not polynomial time.
—
More generally, this shows that even if we know that runs in time at most for some superpolynomial time-constructible , then we cannot decide whether runs in polynomial time.
Proof
▼
with clauses, produce the graph that contains a triangle for each clause, with vertices of the triangle labeled by the literals of the clause. Add an edge between any two complementary literals from different triangles. Finally, set . In our example, we have triangles on and on plus the edges and .
We need to prove two directions. First, if is satisfiable, then has an independent set of size at least . Secondly, if has an independent set of size at least , then is satisfiable. (Note that the latter is the contrapositive of the implication "if is not satisfiable, then does not have an independent set of size at least k".)
For the first direction, consider a satisfying assignment for . Take one true literal from every clause, and put the corresponding graph vertex into a set . Observe that is an independent set of size (where is the number of clauses in ).
For the other direction, take an independent set of size in . Observe that contains exactly one vertex from each triangle (clause), and that does not contain any conflicting pair of literals (such as and , since any such pair of conflicting literals is connected by an edge in ). Hence, we can assign the value True to all the literals corresponding to the vertices in the set , and thereby satisfy the formula .
This reduction is polynomial in time because the graph has 3 vertices per clause and at most O(m²) edges for m clauses, and both can be written down in a single pass over the clauses.
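The reduction as code (clauses are tuples of signed integers; `sat_to_is` is an illustrative name): one triangle per clause, conflict edges between complementary literals in different triangles, and k set to the number of clauses.

```python
# 3SAT -> Independent Set: vertex (i, lit) is literal lit of clause i.
def sat_to_is(clauses):
    vertices = [(i, lit) for i, cl in enumerate(clauses) for lit in cl]
    edges = set()
    for i, cl in enumerate(clauses):          # one triangle per clause
        for a in cl:
            for b in cl:
                if a != b:
                    edges.add(frozenset({(i, a), (i, b)}))
    for u in vertices:                        # conflict edges x / not-x
        for v in vertices:
            if u[0] != v[0] and u[1] == -v[1]:
                edges.add(frozenset({u, v}))
    return vertices, edges, len(clauses)

# phi = (x1 or x2 or x3) and (not x1 or not x2 or x3)
V, E, k = sat_to_is([(1, 2, 3), (-1, -2, 3)])
assert len(V) == 6 and k == 2
assert len(E) == 2 * 3 + 2  # two triangles plus conflicts x1/-x1, x2/-x2
```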
(Cook-Levin)
a. The witness is a word that M accepts in at most n steps. b. Let V be a verifier for of running time at most p, for some polynomial p. For a string w, let be the TM that on input c emulates . Consider the reduction . It is easy to verify that f is polytime computable, and that
3-SAT: Given a CNF formula , where every clause in has *exactly* 3 literals in it, one should determine if there exists an assignment that satisfies it.
Max-2-SAT: Given a CNF formula, where every clause in has *exactly* 2 literals in it, and a positive number , one should determine if there exists an assignment that satisfies *at least* clauses.
Proof
▼
Replace a singleton clause with and many copies of , . Reductions without extra variables don’t work even when singleton clauses are allowed. There are five types of clauses: , , . Suppose that of are true. The probability that a random clause of each type is true is:
We would like to take a convex combination of these rows which is of the form
However, all such convex combinations have .
Proof
▼
Let be a new variable not in . Let there be clauses in . We add and copies of to . In every exactly-1/3-satisfying assignment, must be false.
Let copies . The following claim is easy to prove.
Claim: is satisfiable iff has exactly 1/3rd satisfiable clauses.
Therefore 3SAT 1/3-CNFSAT, and hence 1/3-CNFSAT is NP-complete.