Program synthesis

In computer science, program synthesis is the task of constructing a program that provably satisfies a given high-level formal specification. In contrast to program verification, the program is to be constructed rather than given; however, both fields make use of formal proof techniques, and both comprise approaches with different degrees of automation. In contrast to automatic programming techniques, specifications in program synthesis are usually non-algorithmic statements in an appropriate logical calculus.^[1] The primary application of program synthesis is to relieve the programmer of the burden of writing correct, efficient code that satisfies a specification.
However, program synthesis also has applications to superoptimization and inference of loop invariants.^[2]

Origin

During the Summer Institute of Symbolic Logic at Cornell University in 1957, Alonzo Church defined the problem of synthesizing a circuit from mathematical requirements.^[3] Even though the work refers only to circuits and not programs, it is considered one of the earliest descriptions of program synthesis, and some researchers refer to program synthesis as "Church's Problem". In the 1960s, a similar idea for an "automatic programmer" was explored by researchers in artificial intelligence.^[citation needed] Since then, various research communities have considered the problem of program synthesis. Notable works include the 1969 automata-theoretic approach by Büchi and Landweber,^[4] and the works by Manna and Waldinger (c. 1980). The development of modern high-level programming languages can also be understood as a form of program synthesis.

21st century developments

The early 21st century has seen a surge of practical interest in program synthesis in the formal verification community and related fields.
Armando Solar-Lezama showed that it is possible to encode program synthesis problems in Boolean logic and use algorithms for the Boolean satisfiability problem to automatically find programs.^[5]

Syntax-guided synthesis

In 2013, a unified framework for program synthesis problems called syntax-guided synthesis (stylized SyGuS) was proposed by researchers at UPenn, UC Berkeley, and MIT.^[6] The input to a SyGuS algorithm consists of a logical specification along with a context-free grammar of expressions that constrains the syntax of valid solutions.^[7] For example, to synthesize a function f that returns the maximum of two integers, the logical specification might look like this:

    (f(x,y) = x ∨ f(x,y) = y) ∧ f(x,y) ≥ x ∧ f(x,y) ≥ y

and the grammar might be:

    <Term> ::= x | y | 0 | 1 | <Term> + <Term> | ite(<Cond>, <Term>, <Term>)
    <Cond> ::= <Term> <= <Term>

where "ite" stands for "if-then-else". The expression ite(x <= y, y, x) would be a valid solution, because it conforms to the grammar and satisfies the specification.

From 2014 through 2019, the yearly Syntax-Guided Synthesis Competition (SyGuS-Comp) compared the different algorithms for program synthesis in a competitive event.^[8] The competition used a standardized input format, SyGuS-IF, based on SMT-Lib 2.
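The grammar-constrained search described above can be made concrete with a small sketch. The following is an illustrative enumerative synthesizer, not one of the actual SyGuS-Comp solvers: it enumerates expressions of the example grammar, smallest first, and returns the first one that satisfies the max specification on a handful of sample inputs (all function names here are hypothetical).

```python
# Enumerative syntax-guided synthesis, sketched: search the grammar
#   <Term> ::= x | y | 0 | 1 | <Term> + <Term> | ite(<Cond>, <Term>, <Term>)
#   <Cond> ::= <Term> <= <Term>
# for a term meeting the max(x, y) specification on sample inputs.
from itertools import product

TESTS = [(0, 0), (1, 0), (0, 1), (3, 7), (7, 3), (-2, -5)]

def terms(depth):
    """All <Term> expressions up to the given nesting depth,
    as (source text, evaluator) pairs."""
    base = [("x", lambda x, y: x), ("y", lambda x, y: y),
            ("0", lambda x, y: 0), ("1", lambda x, y: 1)]
    if depth == 0:
        return base
    sub = terms(depth - 1)
    out = list(sub)
    for (sa, fa), (sb, fb) in product(sub, repeat=2):
        out.append((f"({sa} + {sb})",
                    lambda x, y, fa=fa, fb=fb: fa(x, y) + fb(x, y)))
    for (sl, fl), (sr, fr), (st, ft), (se, fe) in product(sub, repeat=4):
        out.append((f"ite({sl} <= {sr}, {st}, {se})",
                    lambda x, y, fl=fl, fr=fr, ft=ft, fe=fe:
                        ft(x, y) if fl(x, y) <= fr(x, y) else fe(x, y)))
    return out

def satisfies_spec(f):
    """(f(x,y) = x or f(x,y) = y) and f(x,y) >= x and f(x,y) >= y."""
    return all((f(x, y) == x or f(x, y) == y)
               and f(x, y) >= x and f(x, y) >= y
               for x, y in TESTS)

def synthesize(max_depth=2):
    for depth in range(max_depth + 1):
        for text, f in terms(depth):
            if satisfies_spec(f):
                return text
    return None

print(synthesize())  # → ite(x <= y, y, x)
```

Checking candidates only on finitely many sample inputs is a simplification; a real solver would verify each candidate against the full logical specification with an SMT solver (the counterexample-guided inductive synthesis loop).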
For example, the following SyGuS-IF encodes the problem of synthesizing the maximum of two integers (as presented above):

    (set-logic LIA)
    (synth-fun f ((x Int) (y Int)) Int
        ((i Int) (c Int) (b Bool))
        ((i Int (c x y (+ i i) (ite b i i)))
         (c Int (0 1))
         (b Bool ((<= i i)))))
    (declare-var x Int)
    (declare-var y Int)
    (constraint (>= (f x y) x))
    (constraint (>= (f x y) y))
    (constraint (or (= (f x y) x) (= (f x y) y)))
    (check-synth)

A compliant solver might return the following output:

    ((define-fun f ((x Int) (y Int)) Int (ite (<= x y) y x)))

The framework of Manna and Waldinger

Non-clausal resolution rules (unifying substitutions not shown):

    Nr | Assertions          | Goals               | Program   | Origin
    51 | E[p]                |                     |           |
    52 | F[p]                |                     |           |
    53 |                     | G[p]                | s         |
    54 |                     | H[p]                | t         |
    55 | E[true] ∨ F[false]  |                     |           | Resolve(51,52)
    56 |                     | ¬F[true] ∧ G[false] | s         | Resolve(52,53)
    57 |                     | ¬F[false] ∧ G[true] | s         | Resolve(53,52)
    58 |                     | G[true] ∧ H[false]  | p ? s : t | Resolve(53,54)

The framework of Manna and Waldinger, published in 1980,^[9]^[10] starts from a user-given first-order specification formula. For that formula, a proof is constructed, thereby also synthesizing a functional program from unifying substitutions.
The framework is presented in a table layout, the columns containing:

* A line number ("Nr") for reference purposes
* Formulas that have already been established, including axioms and preconditions ("Assertions")
* Formulas still to be proven, including postconditions ("Goals")^[note 1]
* Terms denoting a valid output value ("Program")^[note 2]
* A justification for the current line ("Origin")

Initially, background knowledge, preconditions, and postconditions are entered into the table. After that, appropriate proof rules are applied manually. The framework has been designed to enhance human readability of intermediate formulas: contrary to classical resolution, it does not require clausal normal form, but allows one to reason with formulas of arbitrary structure containing any junctors ("non-clausal resolution"). The proof is complete when true has been derived in the Goals column or, equivalently, false in the Assertions column. Programs obtained by this approach are guaranteed to satisfy the specification formula they started from; in this sense they are correct by construction.^[11] Only a minimalist, yet Turing-complete,^[12] purely functional programming language, consisting of conditionals, recursion, and arithmetic and other operators,^[note 3] is supported. Case studies performed within this framework synthesized algorithms to compute e.g. division, remainder,^[13] square root,^[14] term unification,^[15] answers to relational database queries^[16] and several sorting algorithms.^[17]^[18]

Proof rules

Proof rules include:

* Non-clausal resolution (see table). For example, line 55 is obtained by resolving assertion formulas E from 51 and F from 52, which both share some common subformula p.
The resolvent is formed as the disjunction of E, with p replaced by true, and F, with p replaced by false. This resolvent logically follows from the conjunction of E and F. More generally, E and F need to have only two unifiable subformulas p₁ and p₂, respectively; their resolvent is then formed from Eθ and Fθ as before, where θ is the most general unifier of p₁ and p₂. This rule generalizes resolution of clauses.^[19] Program terms of parent formulas are combined as shown in line 58 to form the output of the resolvent. In the general case, θ is applied to the latter also. Since the subformula p appears in the output, care must be taken to resolve only on subformulas corresponding to computable properties.
* Logical transformations. For example, E ∧ (F ∨ G) can be transformed to (E ∧ F) ∨ (E ∧ G) in Assertions as well as in Goals, since both are equivalent.
* Splitting of conjunctive assertions and of disjunctive goals.
An example is shown in lines 11 to 13 of the toy example below.
* Structural induction. This rule allows for synthesis of recursive functions. For a given pre- and postcondition "Given x such that pre(x), find f(x) = y such that post(x,y)", and an appropriate user-given well-ordering ≺ of the domain of x, it is always sound to add an assertion "x′ ≺ x ∧ pre(x′) ⟹ post(x′, f(x′))".^[20] Resolving with this assertion can introduce a recursive call to f in the Program term. An example is given in Manna, Waldinger (1980), pp. 108–111, where an algorithm to compute the quotient and remainder of two given integers is synthesized, using the well-order (n′,d′) ≺ (n,d) defined by 0 ≤ n′ < n.
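The ground (substitution-free) case of the non-clausal resolution rule can be illustrated with a small sketch. The tuple representation of formulas and the simplifier below are assumptions of this sketch, not Manna and Waldinger's notation; the example reproduces line 55 of the table: from assertions E[p] and F[p], derive E[true] ∨ F[false].

```python
# Ground non-clausal resolution, sketched on propositional formulas
# represented as nested tuples, e.g. ("or", ("not", "p"), "q") for ¬p ∨ q.
def subst(formula, target, value):
    """Replace every occurrence of the subformula `target` by `value`."""
    if formula == target:
        return value
    if isinstance(formula, tuple):
        return (formula[0],) + tuple(subst(a, target, value)
                                     for a in formula[1:])
    return formula

def resolve(e, f, p):
    """Line 55: from assertions e = E[p] and f = F[p],
    derive the assertion E[true] ∨ F[false]."""
    return ("or", subst(e, p, True), subst(f, p, False))

def simplify(formula):
    """Push boolean constants out of 'not' and 'or' nodes."""
    if not isinstance(formula, tuple):
        return formula
    args = [simplify(a) for a in formula[1:]]
    if formula[0] == "not":
        return (not args[0]) if isinstance(args[0], bool) else ("not", args[0])
    if True in args:                       # an 'or' with a true disjunct
        return True
    args = [a for a in args if a is not False]
    if len(args) == 1:
        return args[0]
    return ("or",) + tuple(args) if args else False

# E = ¬p ∨ q and F = p ∨ r resolve to (¬true ∨ q) ∨ (false ∨ r),
# which simplifies to q ∨ r — the classical resolvent of the two clauses,
# illustrating how the rule generalizes resolution of clauses.
E = ("or", ("not", "p"), "q")
F = ("or", "p", "r")
print(simplify(resolve(E, F, "p")))  # → ('or', 'q', 'r')
```

The soundness argument mirrors the rule itself: if p is true, the first disjunct E[true] holds because E[p] does; if p is false, the second disjunct F[false] holds because F[p] does.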
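The quotient/remainder derivation cited above yields, in effect, a recursive program of the following shape. This is a sketch in Python rather than Manna and Waldinger's functional target language, and the name divmod_ is hypothetical; the recursive call corresponds to resolving with the induction assertion at the ≺-smaller argument n′ = n − d.

```python
def divmod_(n, d):
    """Quotient and remainder of n by d, for n >= 0 and d > 0."""
    if n < d:
        return (0, n)             # base case: quotient 0, remainder n
    q, r = divmod_(n - d, d)      # recursive call at the well-order-smaller
    return (q + 1, r)             # input n' = n - d, then adjust the quotient

print(divmod_(17, 5))  # → (3, 2)
```

Termination is exactly what the well-order 0 ≤ n′ < n guarantees: each recursive call strictly decreases n, so the base case is eventually reached.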