Q2: What applications of EAs are there?

     In   principle,   EAs  can  compute  any  computable  function,  i.e.
     everything a normal digital computer can do.

     However, EAs are poorly suited to problems for which efficient
     solution methods are already known (unless those problems are
     intended to serve as benchmarks).  Special-purpose algorithms,
     i.e.  algorithms that have a certain amount of problem-domain
     knowledge hard-coded into them, will usually outperform EAs, so
     there is no black magic in EC.  EAs are most useful when no other
     problem-solving strategy is known and the problem domain is
     NP-complete.  That's where EAs come into play: heuristically
     finding solutions where all else fails.

     Following  is  an  incomplete   (sic!)    list   of   successful   EA
     applications:

 BIOCOMPUTING
     Biocomputing, or Bioinformatics, is the field of biology dedicated to
     the automatic analysis of experimental data (mostly sequencing data).
     Several  approaches  to  specific  biocomputing  problems  have  been
     described that involve the use of GA,  GP  and  simulated  annealing.
     General information about biocomputing (software, databases,
     misc.) can be found on the server of the European Bioinformatics
     Institute: http://www.ebi.ac.uk/ebi_home.html
     ENCORE has a good selection of pointers related to this subject.
     VSCN provides a detailed online course on bioinformatics:
     http://www.techfak.uni-bielefeld.de/bcd/Curric/welcome.html

     There are three main domains to which GAs have been applied in
     bioinformatics: protein folding, RNA folding, and sequence
     alignment.

     Protein Folding

     Proteins are one of the essential components of any form of life.
     They are made of twenty different types of amino acid.  These
     amino acids are chained together to form a protein, which can
     contain from a few to several thousand residues.  In most cases,
     the properties and the function of a protein are a result of its
     three-dimensional structure, and it seems that in many cases this
     structure is a direct consequence of the sequence.  Unfortunately,
     it is still very difficult, if not impossible, to deduce the
     three-dimensional structure knowing only the sequence.  A part of
     the VSCN on-line bioinformatics course is dedicated to the use of
     GAs in Protein Folding Prediction.  It contains an extensive
     bibliography and a detailed presentation of the subject with lots
     of explanations and on-line papers.  The URL is:
     http://www.techfak.uni-bielefeld.de/bcd/Curric/ProtEn/contents.html

     Koza [KOZA92] gives one example of GP  applied  to  Protein  Folding.
     Davis  [DAVIS91]  gives  an example of DNA conformation prediction (a
     closely related problem) in his Handbook of GAs.
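
     To make the kind of encoding involved more concrete, here is a
     minimal sketch (not the method of Koza or Davis) using the
     simplified HP lattice model: a genome is a string of moves
     describing a self-avoiding walk on a 2-D lattice, and the fitness
     counts hydrophobic contacts.  The sequence and scoring below are
     illustrative assumptions only.

# Sketch (illustrative, not Koza's or Davis's method): GA individual for
# the simplified HP protein-folding model.  A genome is a string of
# lattice moves; fitness counts H-H contacts between residues that are
# adjacent on the lattice but not neighbours in the chain.
import random

MOVES = {'L': (-1, 0), 'R': (1, 0), 'U': (0, 1), 'D': (0, -1)}

def fold_coordinates(moves):
    """Turn a move string into residue coordinates; None if the walk
    collides with itself (i.e. the fold is invalid)."""
    pos, coords = (0, 0), [(0, 0)]
    for m in moves:
        dx, dy = MOVES[m]
        pos = (pos[0] + dx, pos[1] + dy)
        if pos in coords:
            return None
        coords.append(pos)
    return coords

def fitness(sequence, moves):
    """Count H-H lattice contacts; penalise invalid (colliding) folds."""
    coords = fold_coordinates(moves)
    if coords is None:
        return -1
    contacts = 0
    for i in range(len(sequence)):
        for j in range(i + 2, len(sequence)):      # skip chain neighbours
            if sequence[i] == 'H' and sequence[j] == 'H':
                (xi, yi), (xj, yj) = coords[i], coords[j]
                if abs(xi - xj) + abs(yi - yj) == 1:
                    contacts += 1
    return contacts

sequence = "HPHPPHHPHH"                         # hypothetical H/P sequence
genome = "".join(random.choice("LRUD") for _ in range(len(sequence) - 1))
print(fitness(sequence, genome))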

     RNA Folding

     Determining the full structure of an RNA molecule is about as hard
     as it is for a protein, but describing the intermediate (secondary)
     structure is somewhat easier, because RNA molecules use the same
     base-pairing rules as DNA (Watson-Crick base pairing).  There exist
     deterministic algorithms that, given a set of constraints (rules),
     compute the most stable structure, but: (a) their time and memory
     requirements increase quadratically or more with the length of the
     sequence, and (b) they require simplified rules.  A lot of effort
     has recently been put into applying GAs to this problem (a minimal
     structure-scoring sketch is given at the end of this subsection),
     and several papers can be found (on-line if your institute
     subscribes to these journals):

     A Genetic Algorithm Based Molecular Modelling Technique For RNA
     Stem-loop Structures, H. Ogata, Y. Akiyama and M. Kanehisa,
     Nucleic Acids Research, 1995, vol 23, 3, 419-426

     An Annealing Mutation Operator in the GA for RNA Folding, B.A.
     Shapiro and J.C. Wu, CABIOS, 1996, vol 12, 3, 171-180

     The Computer Simulation of RNA Folding Pathway Using a Genetic
     Algorithm, A.P. Gultyaev, F.D.H. van Batenburg and C.W.A. Pleij,
     Journal of Molecular Biology, 1995, vol 250, 37-51

     Simulated  Annealing  has  also  been  applied  successfully  to this
     problem:

     Description of RNA folding by SA M. Schmitz and G. Steger in  Journal
     of Molecular Biology, 1995, 255, 245-266
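
     As a concrete illustration of the objective such searches work
     with, here is a minimal sketch (not the method of any of the
     papers above): a candidate secondary structure in dot-bracket
     notation is scored against the sequence by rewarding valid
     Watson-Crick (and G-U wobble) pairs.  A GA or SA run could evolve
     structures toward higher scores; the scoring values are assumed
     for illustration.

# Sketch (illustrative scoring): score a dot-bracket secondary structure
# against an RNA sequence.
VALID = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
         ("G", "U"), ("U", "G")}

def structure_score(sequence, structure):
    """Return (#valid pairs) - (#invalid pairs); -1 if the brackets are
    unbalanced."""
    stack, score = [], 0
    for i, symbol in enumerate(structure):
        if symbol == "(":
            stack.append(i)
        elif symbol == ")":
            if not stack:
                return -1                       # unbalanced structure
            j = stack.pop()
            score += 1 if (sequence[j], sequence[i]) in VALID else -1
    return -1 if stack else score

print(structure_score("GGGAAAUCCC", "(((....)))"))   # three valid pairs -> 3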

     Sequence Alignments

     Sequence alignment is another important problem in bioinformatics.
     The aim is to align several related sequences (from two to
     hundreds), given a cost function.  For the most widely used cost
     functions, the problem has been shown to be NP-complete (a minimal
     sketch of such a cost function is given at the end of this
     subsection).  Several attempts have been made using SA:

     Multiple  Sequence Alignment Using SA J. Kim, Sakti Pramanik and M.J.
     Chung, CABIOS, 1994, vol 10, 4, 419-426

     Multiple Sequence Alignment by Parallel SA, M. Isshikawa, T. Koya
     et al., CABIOS, 1993, vol 9, 3, 267-273

     SAM, software which uses Hidden Markov Models for Multiple Sequence
     Alignment, can use SA to train the model.  Several papers have been
     published on SAM.  The software, documentation and an extensive
     bibliography can be found at:
     http://www.cse.ucsc.edu/research/compbio/sam.html

     More recently, various programs using other methods, such as Gibbs
     sampling or GAs, have been developed:

     A Gibbs Sampling Strategy for Multiple Alignment, C.E. Lawrence,
     S.F. Altschul et al., Science, October 1993, vol 262, 208-214

     SAGA: Sequence Alignment by Genetic Algorithm, C. Notredame and
     D.G. Higgins, Nucleic Acids Research, 1995, vol 24, 8, 1515-1524

     A  beta release of SAGA (along with the paper) is  available  on  the
     European    Bioinformatics    Institute    anonymous    FTP   server:
     ftp.ebi.ac.uk/pub/software/unix/saga.tar.Z
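
     The sketch promised above: a sum-of-pairs cost for a multiple
     alignment.  In a GA a genome could hold a set of gapped sequences
     of equal length and this would be the objective to minimise; the
     per-pair costs below are illustrative assumptions, not those used
     by SAGA or the other programs mentioned.

# Sketch (illustrative costs): sum-of-pairs cost of a multiple alignment
# (lower is better).
MATCH, MISMATCH, GAP = 0, 1, 2          # hypothetical per-pair costs

def sum_of_pairs_cost(alignment):
    """Total cost over all columns and all pairs of aligned sequences;
    every sequence in `alignment` must have the same (gapped) length."""
    cost = 0
    for col in range(len(alignment[0])):
        column = [seq[col] for seq in alignment]
        for i in range(len(column)):
            for j in range(i + 1, len(column)):
                a, b = column[i], column[j]
                if a == "-" or b == "-":
                    cost += GAP
                elif a == b:
                    cost += MATCH
                else:
                    cost += MISMATCH
    return cost

print(sum_of_pairs_cost(["ACGT-A", "AC-TTA", "ACGTTA"]))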

 GAME PLAYING
     GAs can be used to evolve behaviors for playing games.  Work in
     evolutionary GAME THEORY typically concerns the EVOLUTION of a
     POPULATION of players who meet randomly to play a game in which
     each must adopt one of a limited number of moves (Maynard-Smith
     1982).  Let's suppose it is just two moves, X and Y.  The players
     receive a reward, analogous to Darwinian FITNESS, depending on
     which combination of moves occurs and which move they adopted.  In
     more complicated models there may be several players and several
     moves.

     The  players  iterate such a game a series of times, and then move on
     to a new partner. At the end of all such moves, the players will have
     a cumulative payoff, their fitness.  This fitness can then be used as
     a means of conducting something akin to Roulette-Wheel  SELECTION  to
     generate a new population.
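
     A minimal sketch of the roulette-wheel step just described is
     given below; player strategies are treated as opaque objects and
     only the cumulative payoffs matter for selection.  The payoff
     values are assumptions for illustration.

# Sketch: roulette-wheel (fitness-proportionate) selection on payoffs.
import random

def roulette_wheel(population, payoffs):
    """Draw a new population; each player's chance of being copied is
    proportional to its cumulative payoff (payoffs assumed positive)."""
    total = sum(payoffs)
    new_population = []
    for _ in population:
        pick = random.uniform(0, total)
        running = 0.0
        for player, payoff in zip(population, payoffs):
            running += payoff
            if running >= pick:
                new_population.append(player)
                break
    return new_population

print(roulette_wheel(["p1", "p2", "p3"], [5.0, 1.0, 2.0]))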

     The real key in using a GA is to come up with an encoding to
     represent players' strategies, one that is amenable to CROSSOVER
     and to MUTATION.  One possibility is to suppose that at each
     iteration a player adopts X with some probability (and Y with one
     minus that probability).  A player can thus be represented as a
     real number, or as a bit-string, by interpreting the decimal value
     of the bit-string as the inverse of the probability.

     An alternative characterisation is to model the players as Finite
     State Machines, or Finite Automata (FA).  These can be thought of
     as a simple flow chart governing behaviour in the "next" play of
     the game, depending upon previous plays.  For example:

	  100 Play X
	  110 If opponent plays X go to 100
	  120 Play Y
	  130 If opponent plays X go to 100 else go to 120
     This represents a strategy that does whatever its opponent did
     last, and begins by playing X; it is known as "Tit-For-Tat"
     (Axelrod 1982).  Such machines can readily be encoded as
     bit-strings.  Consider the encoding "1 0 1 0 0 1" to represent
     TFT.  The first three bits, "1 0 1", are state 0.  The first bit,
     "1", is interpreted as "Play X."  The second bit, "0", is
     interpreted as "if opponent plays X, go to state 0"; the third
     bit, "1", is interpreted as "if the opponent plays Y, go to state
     1."  State 1 has a similar interpretation.  Crossing over such
     bit-strings always yields valid strategies.
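
     A minimal sketch of this decoding, assuming the convention above
     (three bits per state: the move to play, the next state if the
     opponent played X, and the next state if the opponent played Y):

# Sketch: decode a bit-string FSM strategy and play it against a fixed
# sequence of opponent moves.  "101001" decodes to Tit-For-Tat.
def decode(bits):
    """Return a list of states: (move, next_if_X, next_if_Y)."""
    states = []
    for i in range(0, len(bits), 3):
        move = "X" if bits[i] == "1" else "Y"
        states.append((move, int(bits[i + 1]), int(bits[i + 2])))
    return states

def play(bits, opponent_moves):
    """Run the machine from state 0 against the given opponent moves."""
    states, state, history = decode(bits), 0, []
    for opp in opponent_moves:
        move, next_x, next_y = states[state]
        history.append(move)
        state = next_x if opp == "X" else next_y
    return history

print(play("101001", "XXYYX"))     # -> ['X', 'X', 'X', 'Y', 'Y']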

     SIMULATIONs of these machines playing the Prisoner's Dilemma have
     been undertaken (Axelrod 1987, Fogel 1993, Miller 1989).

     Alternative representations of game players include CLASSIFIER
     SYSTEMs (Marimon, McGrattan and Sargent 1990, [GOLD89]) and neural
     networks (Fogel and Harrald 1994), though not necessarily with a
     GA (Fogel (1993) and Fogel and Harrald (1994) use an Evolutionary
     Program).

     Other methods of evolving a population can be found in Lindgren 1991,
     Glance and Huberman 1993 and elsewhere.

     References

     Axelrod, R. (1987) ``The Evolution of Strategies in the Repeated
     Prisoner's Dilemma,'' in [DAVIS91]

     Miller, J.H. (1989) ``The Coevolution of Automata in the Repeated
     Prisoner's Dilemma'' Santa Fe Institute Working Paper 89-003.

     Marimon, Ramon, Ellen McGrattan and Thomas J. Sargent (1990)  ``Money
     as  a  Medium of Exchange in an Economy with Artificially Intelligent
     Agents'' Journal of Economic Dynamics and Control 14, pp. 329--373.

     Maynard-Smith, (1982) Evolution and the Theory of Games, CUP.

     Lindgren, K. (1991) ``Evolutionary Phenomena in Simple Dynamics,'' in
     [ALIFEI].

     Holland, J.H and John Miller (1990) ``Artificially Adaptive Agents in
     Economic Theory,'' American Economic Review: Papers  and  Proceedings
     of  the  103rd  Annual Meeting of the American Economics Association:
     365--370.

     Huberman, Bernardo, and Natalie S. Glance (1993) "Diversity and
     Collective   Action"   in   H.   Haken   and   A.   Mikhailov  (eds.)
     Interdisciplinary Approaches to Nonlinear Systems, Springer.

     Fogel, D.B. (1993) "Evolving Behavior in the Iterated Prisoner's Dilemma"
     Evolutionary Computation 1:1, 77-97

     Fogel,  D.B.  and  Harrald, P. (1994) ``Evolving Complex Behaviour in
     the Iterated Prisoner's Dilemma,'' Proceedings of the  Fourth  Annual
     Meetings of the Evolutionary Programming Society, L.J. Fogel and A.W.
     Sebald eds., World Science Press.

     Lindgren, K. and Nordahl, M.G.  "Cooperation and Community  Structure
     in Artificial Ecosystems", Artificial Life, vol 1:1&2, 15-38

     Stanley, E.A., Ashlock, D. and Tesfatsion, L. (1994) "Iterated
     Prisoners Dilemma with Choice and Refusal of Partners" in [ALIFEIII]
     131-178

 JOB-SHOP SCHEDULING
     The  Job-Shop  Scheduling  Problem  (JSSP)  is  a  very difficult NP-
     complete problem which, so far, seems best addressed by sophisticated
     branch and bound search techniques.  GA researchers, however, are
     continuing to make progress on it.  (Davis 85) started off GA
     research on the JSSP; (Whitley 89) reports on using the edge
     RECOMBINATION operator (designed initially for the TSP) on JSSPs
     too.  More recent work includes (Nakano 91), (Yamada & Nakano 92)
     and (Fang et al. 93).  The latter three report increasingly better
     results from using GAs on fairly large benchmark JSSPs (from Muth
     & Thompson 63); none consistently outperforms branch & bound
     search yet, but they seem well on the way.  A crucial aspect of
     such work (as with any GA
     application) is the method used to  encode  schedules.  An  important
     aspect of some of the recent work on this is that better results have
     been obtained by rejecting the conventional wisdom  of  using  binary
     representations   (as  in  (Nakano  91))  in  favor  of  more  direct
     encodings. In (Yamada & Nakano 92), for example,  a  GENOME  directly
     encodes operation completion times, while in (Fang et al. 93) genomes
     represent implicit instructions for building a schedule. The  success
     of  these  latter techniques, especially since their applications are
     very important in industry, should eventually spawn  advances  in  GA
     theory.
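
     To illustrate what a "more direct encoding" can look like, here is
     a minimal sketch of one common style of indirect representation: a
     job-sequence genome decoded by a simple dispatcher.  It is not the
     specific encoding of Nakano, Yamada & Nakano, or Fang et al., and
     the job data are assumptions for illustration.

# Sketch (illustrative encoding): decode a job-sequence genome into a
# schedule and report its makespan.  The genome lists job numbers, each
# appearing once per operation; reading it left to right dispatches each
# job's next operation to its machine.
def decode_schedule(genome, jobs):
    """jobs[j] is a list of (machine, duration) operations in fixed
    technological order.  Returns the makespan of the implied schedule."""
    next_op = [0] * len(jobs)            # next operation index per job
    job_ready = [0] * len(jobs)          # time at which each job is free
    machine_ready = {}                   # time at which each machine is free
    for j in genome:
        machine, duration = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_ready.get(machine, 0))
        job_ready[j] = start + duration
        machine_ready[machine] = start + duration
        next_op[j] += 1
    return max(job_ready)

# Hypothetical 2-job, 2-machine instance: job 0 uses machine A then B,
# job 1 uses machine B then A.
jobs = [[("A", 3), ("B", 2)], [("B", 4), ("A", 1)]]
print(decode_schedule([0, 1, 0, 1], jobs))     # makespan of this ordering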

     Concerning the point of using GAs at all on hard job-shop
     scheduling problems, the same goes here as suggested below for
     `Timetabling': the GA approach enables relatively arbitrary
     constraints and objectives to be incorporated painlessly into a
     single OPTIMIZATION method.  It is unlikely that GAs will
     outperform specialized knowledge-based and/or conventional
     OR-based approaches to such problems in terms of raw solution
     quality; however, GAs offer much greater simplicity and
     flexibility, and so, for example, may be the best method for quick
     high-quality solutions, rather than for finding the best possible
     solution at any cost.  Also, of course, hybrid methods will have a
     lot to offer, and GAs are far easier to parallelize than typical
     knowledge-based/OR methods.

     Similar to the JSSP is  the  Open  Shop  Scheduling  Problem  (OSSP).
     (Fang  et  al.  93) reports an initial attempt at using GAs for this.
     Ongoing results from the same source show reliable achievement of
     results  within  less than 0.23% of optimal on moderately large OSSPs
     (so far, up to 20x20), including an  improvement  on  the  previously
     best known solution for a benchmark 10x10 OSSP. A simpler form of job
     shop problem is the Flow-Shop Sequencing problem;  recent  successful
     work on applying GAs to this includes (Reeves 93).

     Other scheduling problems

     In contrast to job shop scheduling, some maintenance scheduling
     problems consider which  activities  to  schedule  within  a  planned
     maintenance  period,  rather  than seeking to minimise the total time
     taken by the activities. The constraints on which parts may be  taken
     out  of  service  for  maintenance  at  particular  times may be very
     complex, particularly as they will in general interact. Some  initial
     work is given in (Langdon, 1995).

     References

     Davis,  L.  (1985)  "Job-Shop  Scheduling  with  Genetic Algorithms",
     [ICGA85], 136-140.

     Muth, J.F. & Thompson, G.L. (1963) "Industrial Scheduling".  Prentice
     Hall, Englewood Cliffs, NJ, 1963.

     Nakano,  R.  (1991)  "Conventional  Genetic  Algorithms  for Job-Shop
     Problems", [ICGA91], 474-479.

     Reeves, C.R. (1993) "A Genetic Algorithm  for  Flowshop  Sequencing",
     Coventry Polytechnic Working Paper, Coventry, UK.

     Whitley,  D.,  Starkweather,  T.  &  D'Ann  Fuquay (1989) "Scheduling
     Problems and  Traveling  Salesmen:  The  Genetic  Edge  Recombination
     Operator", [ICGA89], 133-140.

     Fang,  H.-L.,  Ross,  P.,  &  Corne  D.  (1993)  "A Promising Genetic
     Algorithm Approach to Job-Shop Scheduling, Rescheduling  &  Open-Shop
     Scheduling Problems", [ICGA93], 375-382.

     Yamada,  T.  &  Nakano,  R. (1992) "A Genetic Algorithm Applicable to
     Large-Scale Job-Shop Problems", [PPSN92], 281-290.

     Langdon, W.B. (1995) "Scheduling  Planned  Maintenance  of  the  (UK)
     National Grid", cs.ucl.ac.uk:/genetic/papers/grid_aisb-95.ps

 MANAGEMENT SCIENCES
     "Applications  of EA in management science and closely related fields
     like organizational ecology is a domain that has been covered by some
     EA  researchers - with considerable bias towards scheduling problems.
     Since I believe that EA have considerable potential for  applications
     outside   the   rather   narrow  domain  of  scheduling  and  related
     combinatorial problems, I started  collecting  references  about  the
     status  quo  of  EA-applications  in management science.  This report
     intends to make available my findings to  other  researchers  in  the
     field.  It  is  a  short  overview  and  lists some 230 references to
     current as well as finished research projects.  [..]

     "At the end of the paper, a questionnaire has been incorporated  that
     may be used for this purpose. Other comments are also appreciated."

     --- from the Introduction of (Nissen 93)

     References

     Nissen,  V. (1993) "Evolutionary Algorithms in Management Science: An
     Overview and List of References", Papers on Economics and  Evolution,
     edited  by the European Study Group for Evolutionary Economics.  This
     report     is     also     avail.     via     anon.      FTP     from
     ftp.gwdg.de:/pub/msdos/reports/wi/earef.eps

     Boulding,  K.E.  (1991) "What is evolutionary economics?", Journal of
     Evolutionary Economics, 1, 9-17.

 TIMETABLING
     This has been addressed quite successfully with GAs.  A  very  common
     manifestation  of this kind of problem is the timetabling of exams or
     classes in Universities, etc.

     The first application of GAs to the timetabling problem was to  build
     the  schedule  of  the  teachers  in  an  Italian  high  school.  The
     research, conducted at the Department of Electronics and  Information
     of Politecnico di Milano, Italy, showed that a GA was as good as Tabu
     Search, and better  than  simulated  annealing,  at  finding  teacher
     schedules  satisfying  a  number  of  hard and soft constraints.  The
     software package developed is now in current use in some high schools
     in Milano. (Colorni et al 1990)

     At   the   Department   of  Artificial  Intelligence,  University  of
     Edinburgh, timetabling the MSc exams is now done using a GA (Corne et
     al.  93,  Fang  92).  An  example  of  the use of GAs for timetabling
     classes is (Abramson & Abela 1991).

     In the exam timetabling case,  the  FITNESS  function  for  a  GENOME
     representing a timetable involves computing degrees of punishment for
     various problems with the timetable, such as  clashes,  instances  of
     students  having  to  take  consecutive  exams, instances of students
     having (eg) three or more exams in  one  day,  the  degree  to  which
     heavily-subscribed  exams  occur  late  in the timetable (which makes
     marking harder), overall length of timetable, etc.  The modular
     nature of the fitness function is the key to the main potential
     strength of using GAs for this sort of thing as opposed to using
     conventional search and/or constraint programming methods.  The
     power of the GA approach is the ease with which it can handle
     arbitrary kinds of constraints and objectives; all such things can
     be handled as weighted components of the fitness function, making
     it easy to adapt the GA to the particular requirements of a very
     wide range of possible overall objectives.  Very few other
     timetabling methods, for example, deal with such objectives at
     all, which shows how difficult it is (without GAs) to graft the
     capacity to handle arbitrary objectives onto the basic "find
     shortest-length timetable with no clashes" requirement.  The
     proper way to weight/handle different objectives in the fitness
     function in relation to the general GA dynamics remains, however,
     an important research problem!
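
     A minimal sketch of such a weighted-penalty fitness function is
     given below.  The penalty names and weights are assumptions for
     illustration, not those of any of the systems cited; a real system
     would compute the violation counts from a full candidate
     timetable.

# Sketch (illustrative names and weights): weighted-penalty objective
# for exam timetabling; the GA minimises this value.
WEIGHTS = {
    "clash": 100.0,            # student must sit two exams at once
    "consecutive": 10.0,       # student has back-to-back exams
    "three_in_a_day": 30.0,    # student has three or more exams in a day
    "late_big_exam": 5.0,      # heavily-subscribed exam placed late
}

def timetable_penalty(violation_counts):
    """Weighted sum of constraint violations for one candidate timetable."""
    return sum(WEIGHTS[name] * count
               for name, count in violation_counts.items())

# Hypothetical violation counts for one candidate timetable:
print(timetable_penalty({"clash": 0, "consecutive": 7,
                         "three_in_a_day": 2, "late_big_exam": 4}))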

     GAs thus offer a combination of simplicity, flexibility & speed which
     competes  very  favorably  with other approaches, but are unlikely to
     outperform  knowledge-based  (etc)  methods  if  the  best   possible
     solution  is  required at any cost. Even then, however, hybridisation
     may yield the best of both worlds; also, the ease (if the hardware is
     available!)  of implementing GAs in parallel enhances the possibility
     of using them for good, fast solutions to very hard  timetabling  and
     similar problems.

     References
     Abramson & Abela (1991) "A Parallel Genetic Algorithm for Solving the
     School Timetabling Problem",  Technical  Report,  Division  of  I.T.,
     C.S.I.R.O,   April   1991.    (Division  of  Information  Technology,
     C.S.I.R.O., c/o Dept.  of  Communication  &  Electronic  Engineering,
     Royal  Melbourne  Institute  of  Technology,  PO BOX 2476V, Melbourne
     3001, Australia)

     Colorni A., M. Dorigo & V. Maniezzo (1990).  Genetic  Algorithms  And
     Highly  Constrained Problems: The Time-Table Case. Proceedings of the
     First International Workshop on Parallel Problem Solving from Nature,
     Dortmund,  Germany,  Lecture Notes in Computer Science 496, Springer-
     Verlag, 55-59.
     http://iridia.ulb.ac.be/dorigo/dorigo/conferences/IC.01-PPSN1.ps.gz

     Colorni  A.,  M.  Dorigo & V. Maniezzo (1990).  Genetic Algorithms: A
     New Approach to the Time-Table Problem. NATO ASI  Series,  Vol.F  82,
     COMBINATORIAL  OPTIMIZATION,  (Ed.  M.Akguel  and  others), Springer-
     Verlag, 235-239.
     http://iridia.ulb.ac.be/dorigo/dorigo/conferences/IC.02-NATOASI90.ps.gz

     Colorni A., M. Dorigo & V. Maniezzo (1990).  A Genetic  Algorithm  to
     Solve   the   Timetable   Problem.    Technical  Report  No.  90-060,
     Politecnico di Milano, Italy.
     http://iridia.ulb.ac.be/dorigo/dorigo/tec.reps/TR.01-TTP.ps.gz

     Corne,  D. Fang, H.-L. & Mellish, C. (1993) "Solving the Modular Exam
     Scheduling Problem with Genetic  Algorithms".   Proc.  of  6th  Int'l
     Conf.  on  Industrial  and  Engineering  Applications  of  Artificial
     Intelligence & Expert Systems, ISAI.

     Fang,  H.-L.  (1992)  "Investigating   GAs   for   scheduling",   MSc
     Dissertation,   University   of   Edinburgh   Dept.   of   Artificial
     Intelligence, Edinburgh, UK.

 CELLULAR PROGRAMMING: Evolution of Parallel Cellular Machines
     Nature abounds in systems involving the actions of simple,
     locally-interacting components that give rise to coordinated
     global behavior.  These collective systems have evolved by means
     of natural SELECTION to exhibit striking problem-solving
     capacities, while functioning within a complex, dynamic
     ENVIRONMENT.  Employing simple yet versatile parallel cellular
     models, coupled with EVOLUTIONARY COMPUTATION techniques, cellular
     programming is an approach for constructing man-made systems that
     exhibit characteristics such as those manifested by their natural
     counterparts.

     Parallel cellular machines hold potential both scientifically, as
     vehicles for studying phenomena of interest in areas such as
     complex adaptive systems and ARTIFICIAL LIFE, and practically,
     enabling the construction of novel systems endowed with
     evolutionary, reproductive, regenerative, and learning
     capabilities.

     Web site: http://lslwww.epfl.ch/~moshes/cp.html
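
     To give a flavour of the setting, here is a minimal sketch of
     evaluating a cellular automaton rule on the density-classification
     task.  It uses a single uniform rule for simplicity, so it is an
     illustration of the general setting rather than the cellular
     programming algorithm itself; the parameters are assumptions.

# Sketch: fitness of a radius-1 binary CA rule (an 8-bit lookup table,
# and hence an 8-bit genome) on the density-classification task: the CA
# should relax to all 1s if 1s were initially in the majority, else to
# all 0s.  This is a simplified, uniform-CA illustration, not the
# cellular programming algorithm of the references below.
import random

def step(config, rule_bits):
    """One synchronous update of a circular one-dimensional CA."""
    n = len(config)
    return [rule_bits[4 * config[(i - 1) % n] + 2 * config[i]
                      + config[(i + 1) % n]]
            for i in range(n)]

def fitness(rule_bits, width=49, trials=20, steps=100):
    """Fraction of random initial configurations classified correctly."""
    correct = 0
    for _ in range(trials):
        config = [random.randint(0, 1) for _ in range(width)]
        majority = 1 if sum(config) * 2 > width else 0
        for _ in range(steps):
            config = step(config, rule_bits)
        if all(cell == majority for cell in config):
            correct += 1
    return correct / trials

rule = [random.randint(0, 1) for _ in range(8)]   # a random rule table
print(fitness(rule))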

     References:

     Sipper,  M.  (1997)  "Evolution  of  Parallel  Cellular Machines: The
     Cellular Programming Approach", Springer-Verlag, Heidelberg.

     Sipper, M.  (1996)  "Co-evolving  Non-Uniform  Cellular  Automata  to
     Perform Computations", Physica D, 92, 193-208.

     Sipper,  M.  and  Ruppin,  E.  (1997)  "Co-evolving architectures for
     cellular machines", Physica D, 99, 428-441.

     Sipper, M. and  Tomassini,  M.  (1996)  "Generating  Parallel  Random
     Number  Generators By Cellular Programming", International Journal of
     Modern Physics C, 7(2), 181-190.

     Sipper, M. (1997) "Evolving Uniform and Non-uniform Cellular Automata
     Networks",  in  Annual  Reviews of Computational Physics, D. Stauffer
     (ed)

 EVOLVABLE HARDWARE
     The idea of evolving machines, whose origins can  be  traced  to  the
     cybernetics  movement  of  the  1940s  and  the  1950s,  has recently
     resurged in the form of the nascent field of bio-inspired systems and
     evolvable  hardware.  The  field draws on ideas from the EVOLUTIONARY
     COMPUTATION  domain  as  well  as  on  novel  hardware   innovations.
     Recently,  the  term evolware has been used to describe such evolving
     ware, with  current  implementations  centering  on  hardware,  while
     raising  the  possibility of using other forms in the future, such as
     bioware.  The inaugural workshop, Towards  Evolvable  Hardware,  took
     place   in   Lausanne,   in  October  1995,  followed  by  the  First
     International  Conference  on  Evolvable  Systems:  From  Biology  to
     Hardware  (ICES96)  held  in  Japan,  in October 1996. The next major
     event in the field, ICES98, will be held in Lausanne, Switzerland, in
     September 1998.

     References:

     Sipper,  M. et al (1997) "A Phylogenetic, Ontogenetic, and Epigenetic
     View  of  Bio-Inspired  Hardware  Systems",  IEEE   Transactions   on
     Evolutionary Computation, 1(1).

     Sanchez,  E.  and  Tomassini,  M.  (eds)  (1996)  "Towards  Evolvable
     Hardware", Springer-Verlag, Lecture Notes in Computer Science,  1062.

     Higuchi,   T.  et  al  (1997)  "Proceedings  of  First  International
     Conference on Evolvable Systems: From Biology to Hardware  (ICES96)",
     Springer-Verlag, Lecture Notes in Computer Science.
