Theory of Computation III (CS 6800)
This 190-page set of class notes was uploaded by Lisette Hodkiewicz on Wednesday, September 30, 2015. The class notes belong to CS 6800 at Western Michigan University, taught by Elise Dedoncker in Fall. Since its upload, it has received 71 views. For similar materials see /class/216886/cs-6800-western-michigan-university in Computer Science at Western Michigan University.
Date Created: 09/30/15
Finite Tree Automata
Mohamed M. El Wakil (mohamedelwakil.net)

Agenda: Preliminaries (ranked alphabet, terms/subterms, functions, substitutions, trees); Finite Tree Automata; Nondeterministic Finite Tree Automata (NFTA); Complete NFTA; Reduced NFTA; Closure Properties; FTA Theorems.

Reference: Tree Automata Techniques and Applications, by Hubert Comon, Max Dauchet, Remi Gilleron, Florent Jacquemard, Denis Lugiez, Sophie Tison, and Marc Tommasi. http://www.grappa.univ-lille3.fr/tata

Ranked Alphabet
- N: the set of positive integers. N*: the set of finite strings over N.
- A ranked alphabet is a couple (F, Arity), where F is a finite set of symbols and Arity is a mapping from F to N.
- Ex: F = {cons(2), a(0), nil(0)}; shortcut: F = {cons, a, nil}.
- Elements with arity p form the set Fp. Ex: F0 = {nil, a}, F1 = {}, F2 = {cons}.
- Elements in F0 are called constant symbols, in F1 unary symbols, in F2 binary symbols, and in Fp p-ary symbols.

Variables and Terms
- Variables are constants, i.e. 0-ary. Ex: X = {x, y}.
- Let F = {cons, a, nil} and X = {x, y}.
- cons(a, nil) is a ground term: no variables.
- cons(a, x) is not a ground term.
- cons(x, y) and cons(a, nil) are linear terms: no variable occurs more than once.
- cons(cons(x, nil), x) is not a linear term: the variable x occurs twice (the slide's example is garbled in the extraction; this instance fits the definition).
- cons(a, cons(x, nil)) can be represented as a tree.
- T(F, X): the set of terms over the ranked alphabet F and the set of variables X.
- Ex: x, y, a, nil, cons(a, nil), cons(x, y) ∈ T(F, X).

Positions
- Pos(t), the set of positions of term t, is a subset of N*: a position is the string of child indices on the path from the root to a node.
- FPos(t): the set of frontier (leaf) positions.
- VPos(t): the set of variable positions.
- Ex: for t = cons(x, cons(a, nil)): Pos(t) = {ε, 1, 2, 21, 22}, FPos(t) = {1, 21, 22}, VPos(t) = {1} (the slide's examples are garbled; this instance is consistent with the definitions).

Tree
- A finite ordered tree t over a set of labels E is a mapping from a prefix-closed set Pos(t) ⊆ N* into E.
- In other words, a term t ∈ T(F, X) is a finite ordered tree with the leaves labeled with variables or constant symbols, and the internal nodes labeled with symbols of positive arity, with out-degree equal to the arity of the label.
- Tree example: let F = {f, g, h, a, b} and X = {x, y}. (The example tree was a figure and is not recoverable from the extraction.)

SubTerms
- The subterm t|p of a term t ∈ T(F, X) at a position p ∈ Pos(t) is the subtree starting at p: Pos(t|p) = {q : pq ∈ Pos(t)} and t|p(q) = t(pq).
- Ex: for t = f(g(a), b, h(b)): t|1 = g(a), t|2 = b, t|3 = h(b).
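The definitions above can be sketched in code. A minimal sketch (Python; the tuple encoding of terms, with variables as bare strings, is my assumption, not from the slides):

```python
# A term is either a variable (a plain string) or a tuple
# (symbol, subterm_1, ..., subterm_n) where n is the symbol's arity.

def is_ground(t):
    """A term is ground if it contains no variables."""
    if isinstance(t, str):          # a variable
        return False
    return all(is_ground(s) for s in t[1:])

def variables(t):
    """All variable occurrences in t (with repeats, for the linearity check)."""
    if isinstance(t, str):
        return [t]
    return [v for s in t[1:] for v in variables(s)]

def is_linear(t):
    """A term is linear if no variable occurs more than once."""
    vs = variables(t)
    return len(vs) == len(set(vs))

def subterm(t, p):
    """Subterm t|p at position p, a sequence of 1-based child indices."""
    for i in p:
        t = t[i]                    # child i of (symbol, c1, ..., cn)
    return t

cons_a_nil = ('cons', ('a',), ('nil',))
print(is_ground(cons_a_nil))                              # True
print(is_ground(('cons', ('a',), 'x')))                   # False: cons(a, x)
print(is_linear(('cons', 'x', ('cons', 'x', ('nil',)))))  # False: x occurs twice
print(subterm(cons_a_nil, [2]))                           # ('nil',): t|2 = nil
```

Representing a position as a list of child indices matches the N* strings of the slides, with each index telling which child to descend into.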
Functions
- |t| is the size of term t; Height(t) is the height of term t.
- If t ∈ X then |t| = 0 and Height(t) = 0.
- If t ∈ F0 then |t| = 1 and Height(t) = 1.
- If t = f(t1, ..., tn), then |t| = 1 + Σ_{i ∈ [1,n]} |ti| and Height(t) = 1 + max{Height(ti) : i ∈ [1,n]}.
(The slide's worked example is a figure and is not recoverable from the extraction.)
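The two recursive definitions translate directly; a sketch using the same tuple encoding of terms as above (an assumption, not from the slides):

```python
def size(t):
    """|t|: 0 for a variable, otherwise 1 + the sizes of the children."""
    if isinstance(t, str):               # t in X
        return 0
    return 1 + sum(size(s) for s in t[1:])

def height(t):
    """Height(t): 0 for a variable, 1 for a constant, else 1 + max over children."""
    if isinstance(t, str):               # t in X
        return 0
    if len(t) == 1:                      # t in F0 (no children)
        return 1
    return 1 + max(height(s) for s in t[1:])

t = ('cons', ('a',), ('cons', 'x', ('nil',)))   # cons(a, cons(x, nil))
print(size(t), height(t))    # 4 3
```

Note that the constant case of size needs no special branch: 1 plus an empty sum is 1, matching |t| = 1 for t ∈ F0.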
Substitutions
- A substitution is a mapping from X into T(F, X).
- Example: let F = {f(3), g(2), a, b}, X = {x1, x2}, and t = f(x1, x1, x2).
- Applying the substitution σ = {x1 ← a, x2 ← g(b, b)} to t gives σt = f(a, a, g(b, b)).

Finite Tree Automata
- A finite tree automaton (FTA) over F is a tuple A = (Q, F, Qf, Δ): Q is a set of states, Qf ⊆ Q is a set of final states, and Δ is a set of transition rules.
- There is NO explicit initial state.
- Move relation: t →A t' is the result of applying one transition rule to the tree t; →*A is the reflexive and transitive closure of →A.
- (The FTA example slide, a Boolean-expression automaton, was rendered upside down in the extraction and is not recoverable.)

Acceptance
- A ground term t is accepted by a finite tree automaton A = (Q, F, Qf, Δ) if t →*A q for some state q in Qf; in other words, if the state at the root belongs to Qf.
- A run r is the sequence of steps carried out to reach q.
- Transitions in an FTA can be top-down or bottom-up; bottom-up is more powerful than top-down.
- (Two worked example slides are figures and not recoverable from the extraction.)

Nondeterministic Finite Tree Automata (NFTA)
- An FTA is an NFTA if it allows: more than one transition rule with the same left-hand side, or transitions with a state as the left-hand side (ε-rules).
- Example: let F = {g, a} and A = (Q, F, Qf, Δ) with Q = {q0, q1, q}, Qf = {q0}, and Δ = {a → q0, q1 → q, g(q0) → q1, g(q) → q0, g(q1) → q1, g(q) → q1} (the rule set is partly reconstructed from a garbled slide).
- An NFTA may have more than one run for a ground term.
- Complete NFTA: an NFTA A is complete if there is at least one rule f(q1, ..., qn) → q for every n ≥ 0, f ∈ Fn, and q1, ..., qn ∈ Q. Example: the Boolean expression evaluation NFTA is complete.
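The bottom-up acceptance check described above can be sketched in code. A minimal sketch (Python; the dictionary encoding of Δ is my assumption, ε-rules are omitted for brevity, and the concrete rule set below is only a reconstruction of the garbled example slide):

```python
from itertools import product

def reachable_states(t, rules):
    """Bottom-up evaluation: the set of states q with t ->*_A q.
    rules maps (symbol, tuple_of_child_states) to a set of result states."""
    sym, children = t[0], t[1:]
    child_sets = [reachable_states(c, rules) for c in children]
    states = set()
    for qs in product(*child_sets):          # every choice of child states
        states |= rules.get((sym, qs), set())
    return states

def accepts(t, rules, final):
    """t is accepted iff some reachable root state is final."""
    return bool(reachable_states(t, rules) & final)

# An NFTA over F = {g(1), a(0)} with Q = {'q0', 'q1', 'q'}, Qf = {'q0'}.
rules = {
    ('a', ()): {'q0'},
    ('g', ('q0',)): {'q1'},
    ('g', ('q1',)): {'q1'},
    ('g', ('q',)): {'q0', 'q1'},   # two rules sharing one left-hand side
}
final = {'q0'}
print(accepts(('a',), rules, final))          # True: a -> q0
print(accepts(('g', ('a',)), rules, final))   # False: g(a) reaches only q1
```

Returning a *set* of states per subtree is what makes this handle nondeterminism: a term may admit several runs, and it is accepted if any run ends in Qf.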
- An incomplete NFTA can be made complete by adding a "dead state".

Completing an incomplete NFTA
- Example: let F = {g, a} and A = (Q, F, Qf, Δ) with Q = {q0, q1, q}, Qf = {q0}, and Δ = {g(q0) → q1, g(q) → q0}.
- (The completed automaton was shown on a slide rendered upside down in the extraction; completion adds rules sending every missing left-hand side, including the constant a, to a dead state.)

Reduced NFTA
- An NFTA is reduced if all its states are accessible. A state q is accessible if there exists a ground term t such that t →*A q.

Closure Properties
- NFTA-recognizable languages are closed under union, complementation, and intersection.
- Union: let L1, L2 be two recognizable tree languages; hence there are two NFTA, A1 = (Q1, F, Qf1, Δ1) and A2 = (Q2, F, Qf2, Δ2), with L1 = L(A1) and L2 = L(A2). Consider A = (Q, F, Qf, Δ) with Q = Q1 ∪ Q2, Qf = Qf1 ∪ Qf2, and Δ = Δ1 ∪ Δ2. The equality between L(A) and L(A1) ∪ L(A2) is straightforward.
- Complementation: let L be a recognizable tree language, and let A = (Q, F, Qf, Δ) be a complete DFTA such that L(A) = L. Let Ac = (Q, F, Q \ Qf, Δ); then Ac recognizes the complement of the set L.
- Intersection: L1 ∩ L2 is the complement of (complement of L1 ∪ complement of L2); hence NFTA are closed under intersection.

Theorems (1/2)
1. If L is recognizable by an NFTA with ε-rules, then L is recognizable by an NFTA without ε-rules. In other words, it is possible to remove all ε-rules from an NFTA.
2. If L is a recognizable set of ground terms, then there exists a complete NFTA that accepts L. In other words, it is possible to complete an incomplete NFTA.

Theorems (2/2)
3. Let L be a recognizable set of ground terms; then there exists a reduced NFTA that accepts L. In other words, it is possible to reduce a non-reduced NFTA.
4. Let L be a recognizable set of ground terms; then there exists a DFTA that accepts L; that is, it is possible to determinize (convert to deterministic) an NFTA.
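The dead-state completion can be sketched using the same rule encoding as above (Python; the encoding and the helper names are my assumptions):

```python
from itertools import product

def complete(rules, states, alphabet, dead='qdead'):
    """Make the automaton complete: every left-hand side (f, q1..qn)
    gets at least one rule; missing ones go to a fresh dead state.
    alphabet maps each symbol to its arity."""
    full = {k: set(v) for k, v in rules.items()}
    all_states = set(states) | {dead}
    for sym, arity in alphabet.items():
        for qs in product(all_states, repeat=arity):
            if not full.get((sym, qs)):
                full[(sym, qs)] = {dead}    # dead state absorbs everything
    return full, all_states

# The incomplete example: F = {g(1), a(0)}, Q = {q0, q1, q}.
rules = {('g', ('q0',)): {'q1'}, ('g', ('q',)): {'q0'}}
full, Q = complete(rules, {'q0', 'q1', 'q'}, {'g': 1, 'a': 0})
print(len(full))   # 5: four g-rules (one per state, incl. the dead state) + one a-rule
```

This mirrors theorem 2 above: the recognized language is unchanged, because no run through the dead state can ever reach a final state.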
Conclusion
- Finite Tree Automata are powerful, but not widely known or used; only two books cover them.
- An FTA accepts a ground term if, after zero or more transition steps, the root of the tree is labeled with a state in the set of acceptance states.
- FTA are closed under union, intersection, and complementation.
- An FTA can be completed if not complete, and reduced if not reduced. An NFTA can be converted to a DFTA.
(The closing thank-you slide is not recoverable from the extraction.)

Artificial Neural Networks
Mohamed M. El Wakil (mohamedelwakil.net)

Agenda: Natural Neural Networks; Artificial Neural Networks; XOR Example; Design Issues; Applications; Conclusion.

Artificial Neural Networks
- A technique for solving problems by constructing software that works like our brains.
- But how do our brains work?

The Brain
- A massively parallel information processing system: a HUGE network of processing elements. A typical brain contains a network of 10 billion neurons.

A processing element (aka a neuron)
- Cell body: processor. Dendrites: input. Axon: output. Synaptic link: connection to other neurons.
- A neuron is connected to other neurons through about 10,000 synapses.

How it works
- A neuron receives input from other neurons (typically many thousands). Inputs are combined (summed).
- Once input exceeds a critical level, the neuron discharges a spike: an electrical pulse that travels from the body, down the axon, to the next neuron(s).
- The axon endings almost touch the dendrites or cell body of the next neuron.
- Transmission of an electrical signal from one neuron to the next is effected by neurotransmitters: chemicals which are released from the first neuron and which bind to the second.
- This link is called a synapse. The strength of the signal that reaches the next neuron depends on factors such as the amount of neurotransmitter available.

Neuron Abstraction
- Inputs, weights, a summation, a threshold, and an output.

So...
- An artificial network is an imitation of a human neural network, and an artificial neuron is an imitation of a human neuron. Let's see an artificial neural network.

The artificial neuron
- Input plays the role of the dendrites; the output spike plays the role of the axon.
- Input, processing, output: y = f(x1, x2, ..., xm).
- Not all inputs are equal: each input xi has a weight wi, and the neuron combines them as y = Σ_{i=1..m} xi·wi.
- Not all neurons are equal, and the signal is not passed down to the next neuron verbatim: an activation (transfer) function f is applied, y = f(Σ_{i=1..m} xi·wi).

Layers
- The input of a neuron is the output of other neurons.
- A layer is a set of neurons that receive the same input.
- Three types of layers: Input, Hidden, and Output.
- The output is a function of the input, affected by the weights and the transfer functions.

A powerful tool
- An ANN can compute any computable function by the appropriate selection of the network topology and weights values.
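The weighted-sum-plus-activation picture above can be written out directly. A minimal sketch (Python; the function names are mine):

```python
def neuron(inputs, weights, activation):
    """Output of one artificial neuron: activation(sum_i x_i * w_i)."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return activation(total)

def step(v):
    """Threshold activation: fire (1) once the combined input reaches 0."""
    return 1.0 if v >= 0 else 0.0

print(neuron([1.0, 0.0], [0.5, -0.3], step))   # 1.0: 1*0.5 + 0*(-0.3) = 0.5 >= 0
print(neuron([0.0, 1.0], [0.5, -0.3], step))   # 0.0: -0.3 < 0
```

Swapping `step` for a smooth activation (such as the sigmoid discussed under transfer functions) is what makes gradient-based weight adjustment possible later.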
- Also, ANNs can learn from experience; specifically, by trial and error.

Learning by trial and error
A continuous process of:
- Processing an input to produce an output (i.e. trial): compute the output function of a given input.
- Evaluating this output: compare the actual output to the expected (desired) output.
- Adjusting the "processing" accordingly: adjust the weights.

An ANN that learns how to approximate XOR
Three layers:
- Layer 1: Input layer with two neurons.
- Layer 2: Hidden layer with three neurons.
- Layer 3: Output layer with one neuron.
How it works
- Set initial values of the weights randomly, and input the truth table of XOR.
- Loop:
  - Read an input, e.g. 0 and 0.
  - Compute an output, e.g. 0.75343.
  - Compare it to the expected (i.e. desired) output and compute the error, the distance between the two (here, error = 0.75343).
  - Adjust the weights accordingly.
- Until a condition is met: a certain number of iterations, or an error threshold.

Design Issues
- Initial weights: small random values in (-1, 1).
- Transfer function: how the inputs and the weights are combined to produce output.
- Error estimation.
- Weights adjusting (aka training or teaching).
- Number of neurons.
- Data representation.
- Size of the training set.

Transfer Functions
- Linear: the output is proportional to the total weighted input, Y = βX.
- Threshold: the output is set to one of two values, depending on whether the total weighted input is greater than or less than some threshold value.
- Non-linear (e.g. sigmoidal): the output varies continuously, but not linearly, as the input changes; Y = 1 / (1 + e^(-X)) (the slide's formula is garbled; this is the standard sigmoid).

Error Estimation
- RMSE (Root Mean Square Error), or a variation of it, is commonly used to measure the distance between the actual outputs Y_actual and the desired outputs Y_desired.

Weights Adjusting
- After each iteration, weights should be adjusted to minimize the error. Approaches: trying all possible weights, Hebbian learning, artificial evolution, back propagation.

Back Propagation
- For a neuron N, with NW one of N's input weights and Nout its output, the weight is updated as NW = NW + ΔNW (the slide's exact update formula is garbled in the extraction).
- This works only for the last layer, as there we can know both the actual output and the expected output. What about the hidden layer? An error factor is propagated backwards.
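The transfer functions and the RMSE error measure listed above can be sketched as follows (Python; a minimal sketch, with my own function names):

```python
import math

def linear(x, beta=1.0):
    """Linear transfer: output proportional to the total weighted input."""
    return beta * x

def threshold(x, theta=0.0):
    """Threshold transfer: one of two values around a threshold theta."""
    return 1.0 if x >= theta else 0.0

def sigmoid(x):
    """Sigmoidal transfer: continuous but non-linear in the input."""
    return 1.0 / (1.0 + math.exp(-x))

def rmse(actual, desired):
    """Root Mean Square Error between actual and desired outputs."""
    return math.sqrt(sum((a - d) ** 2 for a, d in zip(actual, desired))
                     / len(actual))

print(threshold(0.2), round(sigmoid(0.2), 3))   # 1.0 0.55
# The slide's untrained-XOR example: desired 0, actual 0.75343.
print(round(rmse([0.75343], [0.0]), 5))         # 0.75343
print(round(rmse([0.9, 0.2, 0.8, 0.1], [1, 0, 1, 0]), 3))  # 0.158
```

RMSE over a whole pass of the truth table (the second call) is the usual stopping criterion for the "error threshold" loop condition above.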
- For an output neuron N, ErrorFactor(N) = ExpectedOutput(N) - ActualOutput(N); for a hidden neuron, the error factor is combined from the error factors of the neurons it feeds, weighted by the connecting weights (the slide's worked example for x1 and Y1, Y2, Y3 is garbled in the extraction).

Number of neurons
- Many neurons: higher accuracy, but slower, with a risk of over-fitting (memorizing rather than understanding; the network will be useless with new problems).
- Few neurons: lower accuracy, or inability to learn at all.
- Optimal number: trial and error, or adaptive techniques.

Data representation
- Usually input/output data needs preprocessing: pictures as pixel intensities; smells as molecule concentrations; text as a pattern, e.g. 001 for "Chris" and 010 for "Becky"; numbers as decimal or binary.

Size of training set
- No one-fits-all formula. Some heuristics: use five to ten times as many training samples as the number of weights |W|; use more than |W|/a samples, where a is the desired accuracy (a further log-based bound involving the number of nodes is garbled in the extraction).

Application Areas
- Function approximation; classification; clustering; pattern recognition (radar systems, face identification).

Applications
- Electronic Nose; NETtalk.

Electronic Nose
- Developed by Dr. Amy Ryan at JPL, NASA. Goal: detect low concentrations of dangerous gases.
- Ammonia is dangerous at a concentration of a few parts per million (ppm); humans can't sense it until it reaches about 50 ppm.
- An E-Nose can differentiate between Pepsi and Coca-Cola. Can you?
- An E-Nose uses a collection of 16 different polymer films, specially designed to conduct electricity. When a substance is absorbed into these films, the films expand slightly, and that changes how much electricity they conduct.
- Because each film is made of a different polymer, each one reacts to each substance (or analyte) in a slightly different way. An artificial neural network combines the differences in conductivity to detect the substance that caused them.

NETtalk
- An experiment carried out in the mid-1980s by Terrence Sejnowski and Charles Rosenberg. Goal: teach a computer how to pronounce words.
- The network is fed words together with their phonemes and articulatory features. It really learns! (Listen.)

Advantages / Disadvantages
- Advantages: adapts to unknown situations; powerful (it can model complex functions); easy to use (it learns by example, and very little user domain-specific expertise is needed).
- Disadvantages: forgets; not exact; large complexity of the network structure.

Status of Neural Networks
- Most of the reported applications are still in the research stage.
- There are no formal proofs, but they seem to have useful applications that work.

Conclusion
- Artificial Neural Networks are an imitation of biological neural networks, but much simpler ones.
- ANNs can learn via trial and error.
- Many factors affect the performance of ANNs, such as the transfer functions, the size of the training sample, the network topology, and the weights-adjusting algorithm.
- There are no formal proofs, but they seem to have useful applications.

References
- BrainNet II: Creating A Neural Network Library
- Electronic Nose; NETtalk (Wikipedia)
- A primer on Artificial Neural Networks (NeuriCam)
- Artificial Neural Networks Application (Peter Andras)
- Introduction to Artificial Neural Networks (Nicolas Galoppo von Borries)
- Artificial Neural Networks (Torsten Reil)
- What is a "neural net"?
- Introduction to Artificial Neural Network (Jianna J. Zhang, Bellingham AI Robotics Society, Bellingham, WA)
- Elements of Artificial Neural Networks (Kishan Mehrotra, Chilukuri K. Mohan, and Sanjay Ranka, MIT Press, 1996)
- Thanks to all folks who share their material online!

Artificial Neural Networks
Mohamed M. El Wakil (mohamedelwakil.net)

Learning Paradigms
- Supervised learning; unsupervised learning; reinforcement learning.

Supervised learning
- This is what we have seen so far: a network is fed a set of training samples (inputs and corresponding outputs), and it uses these samples to learn the general relationship between the inputs and the outputs.
- This relationship is represented by the values of the weights of the trained network.

Unsupervised learning
- No desired output is associated with the training data: "learning by doing".
- Faster than supervised learning.
- Used to find out structures within data: clustering, compression.

Reinforcement learning
- Like supervised learning, but weights adjusting is not directly related to the error value; the error value is used to randomly shuffle the weights. Slow, due to randomness.

Intractable Problems: The Classes P and NP
Mohamed M. El Wakil (mohamedelwakil.net)

Agenda: What is a problem?; Decidable or not; The P class; The NP class; The NP-Complete class.

What is a problem?
- A problem is a question to be answered: "What is the value of X*Y?". A problem usually has parameters (X and Y).
- A decision problem is a version of the problem with only two possible answers, Yes or No: "Given two numbers X and Y, does Y evenly divide X?".
- An instance is a specific problem instance: "Does 3 evenly divide 6?".

Decidable or not
- A decidable problem is a problem that can be solved using a computer.
- An undecidable problem is a problem that can never be solved using a computer, neither now nor in the future.
- Only decidable problems are classified below.

Classification
- We need to classify problems in terms of their computability. Three classes: the P class, the NP class, and the NP-Complete class.

P class, w.r.t. computers
- Problems with at least one algorithm that solves the problem in polynomial time w.r.t. the input size.
- Polynomial time: the number of steps needed relates polynomially to the size of the input: O(n^2), O(n^9), O(n^c) where c is a constant, but NOT O(2^n), where n is the size of the input.

P class, w.r.t. Turing machines
- Problems solvable in polynomial time using a Deterministic Turing Machine (DTM) belong to the class P.
- Polynomial time: the number of moves needed relates polynomially to the size of the input: n^2, 17n^3, 9n^4, but NOT 2^n.
- A DTM is a Turing machine with a tape, a head, a transition function, and a set of states.

P Problem Example: MWST (Minimum Weight Spanning Tree)
- Given a weighted graph G, find the minimum weight spanning tree.
- In other words, convert the given graph into a tree that includes all the nodes of the original graph and minimizes the summation of the weights of the edges in the resulting tree.
MWST Example
- (Problem instance figure; source: http://en.wikipedia.org/wiki/Kruskal's_algorithm)
- The MWST problem belongs to the P class of problems, since there is an algorithm that solves it in polynomial time: Kruskal's algorithm, O(n^2).

Kruskal's algorithm
- Create a forest F (a set of trees), where each vertex in the graph is a separate tree.
- Create a set S containing all the edges in the graph.
- While S is non-empty: remove an edge with minimum weight from S; if that edge connects two different trees, add it to the forest, combining two trees into a single tree; otherwise, discard that edge.
- (Possible solution figure; source: http://en.wikipedia.org/wiki/Kruskal's_algorithm)

NP class, w.r.t. Turing machines
- Problems solvable in polynomial time using a Non-Deterministic Turing Machine (NDTM) belong to the class NP.
- An NDTM is a DTM with two stages of processing: guessing and checking. Guessing: guess a solution and write it down to the tape. Checking: evaluate the guess to decide whether it solves the problem or not.
- The number of guessed solutions can be either polynomial or exponential. If the number of guessed solutions is polynomial, then the NDTM is equivalent to a DTM.

NP class, w.r.t. computers
- Problems that can be solved within exponential time w.r.t. the input size. This includes problems that can be solved in polynomial time.
- Important: a DTM is an NDTM that has a polynomial number of guesses. According to the definition of NP, the MWST problem is also an NP problem.

NP Problem Example: Travelling Salesman Problem (TSP)
- Given a number of cities and the costs of traveling from any city to any other city, what is the cheapest round-trip route that visits each city exactly once and then returns to the starting city? (Source: http://en.wikipedia.org/wiki/Travelling_salesman_problem)

Solving the TSP
- There is no known algorithm that solves this problem in polynomial time.
- The only way is to enumerate all possible itineraries and check them one by one. For n cities, there are n! routes.
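The enumerate-and-check approach just described can be sketched directly (Python; a minimal sketch on a made-up cost matrix; fixing the start city cuts the n! itineraries down to (n-1)!):

```python
from itertools import permutations

def tsp_brute_force(cost):
    """Enumerate every round trip from city 0 through all other cities
    exactly once, and keep the cheapest one."""
    n = len(cost)
    best_route, best_cost = None, float('inf')
    for perm in permutations(range(1, n)):       # (n-1)! candidate orders
        route = (0,) + perm + (0,)               # return to the start city
        total = sum(cost[route[i]][route[i + 1]] for i in range(n))
        if total < best_cost:
            best_route, best_cost = route, total
    return best_route, best_cost

# 4 cities with symmetric, made-up travel costs.
cost = [[0, 1, 4, 6],
        [1, 0, 2, 5],
        [4, 2, 0, 3],
        [6, 5, 3, 0]]
route, total = tsp_brute_force(cost)
print(total)    # 12: the tour 0 -> 1 -> 2 -> 3 -> 0 costs 1 + 2 + 3 + 6
```

The factorial loop is exactly why this sits in NP rather than (known) P: checking one guessed itinerary is fast, but the number of guesses explodes.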
Polynomial-Time Reduction
- A problem P1 is polynomially reducible to problem P2 if there is a process that takes an instance of P1 as input and outputs a corresponding instance of P2 in polynomial time. (The slide's concrete example mapping is garbled in the extraction.)

NP-Complete Class
- A problem P is NP-Complete if: P is in NP, and for every problem L in NP there is a polynomial-time reduction from L to P.
- If P1 is NP-Complete and there is a polynomial-time reduction from P1 to P2, then P2 is NP-Complete.

NP-Complete problems family tree
- Subset Problem, Clique Problem, Vertex Cover Problem, Hamiltonian Cycle, Travelling Salesman.

The NP World
- (Diagram; source: http://en.wikipedia.org/wiki/Complexity_classes_P_and_NP)

Other NP-Complete Problems
Mohamed M. El Wakil (mohamedelwakil.net)

Agenda: Why; How to describe; Independent Set Problem; Node Cover Problem; Directed Hamilton Circuit Problem; Undirected Hamilton Circuit Problem; Traveling Salesman Problem.

Remember
- A problem P is NP-Complete if P is in NP and, for every problem L in NP, there is a polynomial-time reduction from L to P; or, if P1 is NP-Complete and there is a polynomial-time reduction from P1 to P, then P is NP-Complete.

Why?
- When a problem is proved to be NP-Complete, there is no need to attempt finding an efficient optimal solution for it.
- Every new NP-Complete problem re-assures that all other NP-Complete problems require exponential time to solve.

NP-Complete problems description
- Name and abbreviation; input (how it is represented); output (when the output shall be "Yes"); the other NP-Complete problem used to prove the NP-Completeness of the problem at hand.

The Independent Set Problem
- A subset I of the nodes of a graph G is called an independent set if no two nodes in I are connected by an edge in G. (Figure: "The red nodes are independent sets"; source: MathWorld.)
- IS: given a graph G = (V, E) and an integer k, |V| ≥ k ≥ 1, does G have an independent set of k or more nodes?
- IS is NP-Complete: it is in NP, and 3SAT reduces to it in polynomial time.
1. IS is in NP: an IS guess is verifiable in polynomial time using a DTM. Guess a set Gu of k nodes; check that no two nodes of Gu make an edge in G. Hence IS is in NP.
2. 3SAT is reducible to IS in polynomial time.
- Input: E = e1 ∧ e2 ∧ ... ∧ em, where E is in 3-CNF, i.e. E has 3m literals.
- Output: a graph G with 3m nodes and some edges, and k = m.
- Nodes: every node is given a name n(i, j), where i is a clause number (m ≥ i ≥ 1) and j is a literal number (3 ≥ j ≥ 1). A column of nodes corresponds to a clause. Ex (literal signs are partly garbled in the extraction): (x1 ∨ x2 ∨ x3)(¬x1 ∨ x2 ∨ x4)(x2 ∨ x3 ∨ x5)(x3 ∨ x4 ∨ x5).
- Edges (1/2): put an edge between all pairs of nodes in a column.
- Edges (2/2): put an edge between every complementary pair of nodes.
- k is set to be 3SAT's m: G has an independent set of size m iff E is satisfiable.

The Vertex Cover Problem
- A vertex (node) cover of a graph G = (V, E) is a subset V' of V such that for every edge (u, v) in E, either u or v is in V'. (Figure; source: Wikipedia.) Ex: V' = {1, 3, 5, 6} is a vertex cover; V' = {2, 4, 5} is another vertex cover.
- VC: given a graph G = (V, E) and an integer k, |V| ≥ k ≥ 0, does G have a vertex cover with k or fewer nodes?
- VC is NP-Complete, as (1) it is in NP and (2) there is a polynomial-time reduction from IS to VC.
1. VC is in NP: a VC guess is verifiable in polynomial time using a DTM. Guess Gu, a set of k nodes; check that each edge in G has at least one end node in Gu. Hence VC is in NP.
2. IS is reducible to VC in polynomial time: VC is the complement of IS. If I is a largest independent set of graph G = (V, E), then V - I is a smallest vertex cover of G. Ex: {1, 3, 5, 6} is a VC and {2, 4} is an IS; {2, 4, 5} is a VC and {1, 3, 6} is an IS.
- IS reduction to VC: the VC instance G' is a copy of the IS instance G; the VC instance k is |V| minus the IS instance's k. Obviously, this can be done in polynomial time.

The Directed Hamilton Circuit Problem
- A Hamiltonian circuit is a cycle in a graph which visits each vertex exactly once. (Example graph figure; source: Wikipedia.)
- DHC: given a directed graph G, does it have a Hamilton circuit?
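The polynomial-time "check" stage for an Independent Set guess, and the IS/VC complement relation, can be sketched as follows (Python; the 5-cycle instance is my own example, not from the slides):

```python
def is_independent_set(nodes, edges):
    """Check stage: no two chosen nodes form an edge (polynomial time)."""
    chosen = set(nodes)
    return not any(u in chosen and v in chosen for u, v in edges)

def complement_cover(vertices, independent):
    """If I is an independent set of G, then V - I is a vertex cover of G."""
    return set(vertices) - set(independent)

edges = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 1)]   # a 5-cycle
print(is_independent_set({1, 3}, edges))    # True
print(is_independent_set({1, 2}, edges))    # False: (1, 2) is an edge
cover = complement_cover({1, 2, 3, 4, 5}, {1, 3})
print(all(u in cover or v in cover for u, v in edges))   # True: {2, 4, 5} covers
```

Note that only the *check* is cheap; finding a largest independent set still requires searching exponentially many candidate subsets, which is the whole point of the NP-Completeness argument.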
- DHC is NP-Complete, as (1) it is in NP and (2) there is a polynomial-time reduction from 3SAT to DHC.
1. DHC is in NP: a DHC guess is verifiable in polynomial time using a DTM. Guess Gu, a cycle in G; check that all needed edges are in G. Hence DHC is in NP.
2. 3SAT is reducible to DHC in polynomial time. This is rather lengthy, so we will skip it.

The Undirected Hamilton Circuit Problem
- HC: given an undirected graph G, does G have a Hamilton circuit?
- HC is NP-Complete, as (1) it is in NP and (2) there is a polynomial-time reduction from DHC to HC.
1. HC is in NP: an HC guess is verifiable in polynomial time using a DTM. Guess Gu, a cycle in G; check that all needed edges are in G. Hence HC is in NP.
2. DHC is reducible to HC in polynomial time. Input: a directed graph Gd; output: an undirected graph Gu.
- Constructing Gu, nodes: for every node v in Gd, there are three nodes v0, v1, v2 in Gu.
- Constructing Gu, edges: for every node v in Gd, there are two edges (v0, v1) and (v1, v2) in Gu; if there is an edge (v, w) in Gd, then add the edge (v2, w0) to Gu.

The Traveling Salesman Problem
- A salesman wants to visit all these cities, incurring the least possible cost. In graph terminology, we want a Hamilton circuit with the minimum summation of edge weights.
- TSP (decision version): given an undirected weighted graph G and a number k, is there a Hamilton circuit in G such that the sum of the weights on the edges of the HC is less than or equal to k?
- TSP is NP-Complete, as (1) it is in NP and (2) there is a polynomial-time reduction from HC to TSP.
1. TSP is in NP: a TSP guess is verifiable in polynomial time using a DTM. Guess an itinerary (i.e. a cycle) in G; check that the sum of the weights of all involved edges is less than or equal to k. Hence TSP is in NP.
2. HC is reducible to TSP in polynomial time. Input: an undirected graph G_hc; output: an undirected weighted graph G_tsp and a number k.
- Constructing the TSP instance: G_tsp nodes are a copy of G_hc nodes; G_tsp edges are a copy of G_hc edges, with the weight of all edges set to 1.
- k is set to be the number of nodes of G_hc. Evidently, this is a polynomial-time reduction.

Conclusion
- Proving NP-Completeness is not easy: the problem must be proved to be in NP, and there should be a polynomial-time reduction from another NP-Complete problem.
- (A closing diagram of the reductions used, 3SAT to IS to VC and 3SAT to DHC to HC to TSP, is garbled in the extraction.)

That's all, folks! Questions?
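As a closing sketch, the HC-to-TSP reduction described in the last section can be written out (Python; a minimal sketch, with the edge-list representation as my assumption):

```python
def hc_to_tsp(vertices, edges):
    """Reduce Undirected Hamilton Circuit to TSP: copy the graph,
    give every edge weight 1, and set k = |V|. A weight-<=k tour
    then exists iff the original graph has a Hamilton circuit."""
    weighted_edges = [(u, v, 1) for u, v in edges]
    k = len(vertices)
    return vertices, weighted_edges, k

# A 4-cycle: it has a Hamilton circuit, so the TSP instance is a yes-instance.
V = ['a', 'b', 'c', 'd']
E = [('a', 'b'), ('b', 'c'), ('c', 'd'), ('d', 'a')]
vs, we, k = hc_to_tsp(V, E)
print(k)    # 4
```

The transformation touches each vertex and edge once, so it runs in linear (hence polynomial) time, which is all the reduction argument needs.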