Explanation-based learning

From Wikipedia, the free encyclopedia

Explanation-based learning (EBL) is a form of machine learning that exploits a very strong, or even perfect, domain theory (i.e. a formal theory of an application domain, akin to a domain model in ontology engineering and not to be confused with Scott's domain theory) in order to make generalizations or form concepts from training examples.[1] It has also been linked to encoding in memory as a process that supports learning.[2]

Details

An example of EBL using a perfect domain theory is a program that learns to play chess from examples. A specific chess position that contains an important feature, such as "Forced loss of black queen in two moves", also includes many irrelevant features, such as the particular scattering of pawns on the board. EBL can take a single training example and determine which features are relevant in order to form a generalization.[3]

A domain theory is perfect or complete if it contains, in principle, all information needed to decide any question about the domain. For example, the domain theory for chess is simply the rules of chess. Knowing the rules, it is in principle possible to deduce the best move in any situation. However, actually making such a deduction is impossible in practice because of combinatorial explosion. EBL uses training examples to make searching for deductive consequences of a domain theory efficient in practice.

In essence, an EBL system works by finding a way to deduce each training example from the system's existing database of domain theory. A short proof of the training example is then added to the domain-theory database, enabling the EBL system to find and classify future examples that are similar to the training example very quickly.[4] The main drawback of the method, namely the cost of applying the learned proof macros as they become numerous, was analyzed by Minton.[5]
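
As a concrete illustration, the following minimal Python sketch runs this process on the classic "cup" example used in textbook treatments of EBL (e.g., Mitchell's Machine Learning). The rule set, the operational predicates, and the simple string-based encoding are simplifying assumptions made for exposition rather than any standard EBL implementation; real systems typically work over first-order logic with unification.

    # Domain theory: each head predicate is entailed by the listed body
    # predicates (all predicates describe the same single object).
    DOMAIN_THEORY = {
        "cup":         ["liftable", "stable", "open_vessel"],
        "liftable":    ["light", "has_handle"],
        "stable":      ["has_flat_bottom"],
        "open_vessel": ["has_concavity", "concavity_points_up"],
    }

    # Operationality criterion: these predicates are directly observable.
    OPERATIONAL = {"light", "has_handle", "has_flat_bottom",
                   "has_concavity", "concavity_points_up"}

    def explain(goal, facts):
        """Backward-chain from `goal`; return the operational leaves of the
        proof tree, or None if no explanation exists for these facts."""
        if goal in OPERATIONAL:
            return [goal] if goal in facts else None
        if goal not in DOMAIN_THEORY:
            return None
        leaves = []
        for subgoal in DOMAIN_THEORY[goal]:
            sub = explain(subgoal, facts)
            if sub is None:
                return None
            leaves.extend(sub)
        return leaves

    def generalize(goal, facts):
        """Turn one explained example into a macro rule that re-expresses
        the goal directly in operational terms."""
        leaves = explain(goal, facts)
        return None if leaves is None else (goal, leaves)

    # A single positive training example: the observable facts about one object.
    example = {"light", "has_handle", "has_flat_bottom",
               "has_concavity", "concavity_points_up"}

    print(generalize("cup", example))
    # ('cup', ['light', 'has_handle', 'has_flat_bottom',
    #          'has_concavity', 'concavity_points_up'])

The learned macro rule classifies a new object as a cup in a single inference step, without redoing the proof, which is the sense in which a short proof of the training example extends the domain-theory database.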

Basic formulation

EBL software takes four inputs (see the sketch after this list):

  • a hypothesis space (the set of all possible conclusions)
  • a domain theory (axioms about a domain of interest)
  • training examples (specific facts that rule out some possible hypotheses)
  • operationality criteria (criteria for determining which features in the domain are efficiently recognizable, e.g. which features are directly detectable using sensors)[6]
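
Continuing the illustrative cup example from the sketch above, the four inputs might be packaged as follows; the class and field names are invented for exposition and do not correspond to any standard EBL interface.

    from dataclasses import dataclass
    from typing import Callable, Dict, List, Set

    @dataclass
    class EBLProblem:
        hypothesis_space: List[str]            # candidate target concepts
        domain_theory: Dict[str, List[str]]    # head -> body predicates (Horn-style rules)
        training_examples: List[Set[str]]      # observable facts per example
        is_operational: Callable[[str], bool]  # operationality criterion

    problem = EBLProblem(
        hypothesis_space=["cup"],
        domain_theory={"cup": ["liftable", "stable", "open_vessel"],
                       "liftable": ["light", "has_handle"],
                       "stable": ["has_flat_bottom"],
                       "open_vessel": ["has_concavity", "concavity_points_up"]},
        training_examples=[{"light", "has_handle", "has_flat_bottom",
                            "has_concavity", "concavity_points_up"}],
        is_operational=lambda p: p in {"light", "has_handle", "has_flat_bottom",
                                       "has_concavity", "concavity_points_up"},
    )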

Application

An especially good application domain for EBL is natural language processing (NLP). Here a rich domain theory, i.e. a natural language grammar, although neither perfect nor complete, is tuned to a particular application or particular language usage using a treebank (the training examples). Rayner pioneered this work.[7] The first successful industrial application was a commercial natural-language interface to relational databases.[8] The method has been successfully applied to several large-scale natural language parsing systems,[9] where the utility problem was solved by omitting the original grammar (domain theory) and using specialized LR-parsing techniques, resulting in huge speed-ups, at a cost in coverage but with a gain in disambiguation. EBL-like techniques have also been applied to surface generation, the converse of parsing.[10]
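
As a rough, purely illustrative sketch of the grammar specialization described above, the following Python fragment keeps only the grammar rules that actually occur in treebank derivations; the grammar, treebank, and threshold are invented, and the cited systems additionally compile the surviving rules into specialized LR parsing tables rather than simply filtering them.

    from collections import Counter

    # A general grammar (the domain theory): some rules are rarely or never
    # used in the kind of text the application actually sees.
    general_grammar = {
        ("S",  ("NP", "VP")),
        ("NP", ("Det", "N")),
        ("NP", ("Det", "Adj", "N")),
        ("NP", ("N",)),
        ("VP", ("V", "NP")),
        ("VP", ("V",)),
    }

    # A tiny treebank (the training examples): each analysed sentence
    # contributes the list of rules used in its derivation.
    treebank_derivations = [
        [("S", ("NP", "VP")), ("NP", ("Det", "N")), ("VP", ("V", "NP"))],
        [("S", ("NP", "VP")), ("NP", ("Det", "N")), ("VP", ("V",))],
    ]

    usage = Counter(rule for deriv in treebank_derivations for rule in deriv)

    # Keep only rules observed at least THRESHOLD times; raising the threshold
    # trades coverage for speed and disambiguation, as discussed in the text.
    THRESHOLD = 1
    specialized_grammar = {r for r in general_grammar if usage[r] >= THRESHOLD}

    print(sorted(specialized_grammar))
    # The unused NP -> Det Adj N and NP -> N rules have been pruned.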

When applying EBL to NLP, the operationality criteria can be hand-crafted,[11] or can be inferred from the treebank, using either the entropy of its or-nodes[12] or a target coverage/disambiguation trade-off (equivalently, a recall/precision trade-off, or f-score).[13] EBL can also be used to compile grammar-based language models for speech recognition, from general unification grammars.[14] Note how the utility problem, first exposed by Minton, was solved by discarding the original grammar/domain theory, and that the cited articles tend to use the phrase grammar specialization, quite the opposite of the original term explanation-based generalization. Perhaps the best name for this technique would be data-driven search space reduction. Other people who have worked on EBL for NLP include Guenther Neumann, Aravind Joshi, Srinivas Bangalore, and Khalil Sima'an.
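
The entropy criterion can be sketched as follows; the categories, the frequencies, and the idea of a single global threshold are illustrative assumptions, and the cited work defines the exact cutting rule more carefully.

    import math
    from collections import Counter

    def entropy(counts):
        """Shannon entropy (in bits) of an empirical distribution."""
        total = sum(counts)
        return -sum((c / total) * math.log2(c / total) for c in counts if c)

    # Invented expansion frequencies per category ("or-node") in a treebank.
    expansions = {
        "NP": Counter({("Det", "N"): 90, ("Det", "Adj", "N"): 8, ("N",): 2}),
        "VP": Counter({("V", "NP"): 50, ("V",): 50}),
    }

    for category, dist in expansions.items():
        h = entropy(dist.values())
        print(f"{category}: entropy = {h:.2f} bits")
    # NP: entropy = 0.54 bits   (expansion is nearly deterministic)
    # VP: entropy = 1.00 bits   (expansion is maximally uncertain here)
    # Comparing these values against a threshold then selects the nodes at
    # which the treebank trees are cut when deriving the specialized grammar.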

References

  1. ^ "Special issue on explanation in case-based reasoning". Artificial Intelligence Review. 24 (2). October 2005.
  2. ^ Calin-Jageman, Robert J.; Horn Ratner, Hilary (2005-12-01). "The Role of Encoding in the Self-Explanation Effect". Cognition and Instruction. 23 (4): 523–543. doi:10.1207/s1532690xci2304_4. ISSN 0737-0008. S2CID 145410154.
  3. ^ Black-queen example from Mitchell, Tom (1997). Machine Learning. McGraw-Hill. pp. 308–309. ISBN 0-07-042807-7.
  4. ^ Mitchell, Tom (1997). Machine Learning. McGraw-Hill. p. 320. ISBN 0-07-042807-7. "In its pure form, EBL involves reformulating the domain theory to produce general rules that classify examples in a single inference step."
  5. ^ Minton, Steven (1990). "Quantitative Results Concerning the Utility Problem in Explanation-Based Learning". Artificial Intelligence. 42 (2–3): 363–392. doi:10.1016/0004-3702(90)90059-9.
  6. ^ Keller, Richard (1988). "Defining operationality for explanation-based learning" (PDF). Artificial Intelligence. 35 (2): 227–241. doi:10.1016/0004-3702(88)90013-6. Retrieved 2009-02-22. "Current operationality definition: a concept description is operational if it can be used efficiently to recognize instances of the concept it denotes." After stating this common definition, the paper argues against it in favor of more refined criteria.
  7. ^ Rayner, Manny (1988). "Applying Explanation-Based Generalization to Natural Language Processing". Procs. International Conference on Fifth Generation Computing, Kyoto. pp. 1267–1274.
  8. ^ Samuelsson, Christer; Manny Rayner (1991). "Quantitative Evaluation of Explanation-Based Learning as an Optimization Tool for a Large-Scale Natural Language System". Procs. 12th International Joint Conference on Artificial Intelligence, Sydney. pp. 609–615.
  9. ^ Samuelsson, Christer (1994). Fast Natural-Language Parsing Using Explanation-Based Learning. Stockholm: Doctoral Dissertation, Royal Institute of Technology.
  10. ^ Samuelsson, Christer (1996). "Example-Based Optimization of Surface-Generation Tables". In R. Mitkov and N. Nicolov (eds.), Recent Advances in Natural Language Processing, vol. 136 of Current Issues in Linguistic Theory. Amsterdam: John Benjamins.
  11. ^ Rayner, Manny; David Carter (1996). "Fast Parsing using Pruning and Grammar Specialization". Procs. ACL, Santa Cruz.
  12. ^ Samuelsson, Christer (1994). "Grammar Specialization through Entropy Thresholds". Procs. ACL, Las Cruces. pp. 188–195.
  13. ^ Cancedda, Nicola; Christer Samuelsson (2000). "Corpus-based Grammar Specialization". Procs. 4th Computational Natural Language Learning Workshop.
  14. ^ Rayner, Manny; Beth Ann Hockey; Pierrette Bouillon (n.d.). Putting Linguistics into Speech Recognition: The Regulus Grammar Compiler. ISBN 1-57586-526-2.