
Programming language prototyping in Mathematica

Mathematica Asked by Math Gaudium on June 9, 2021

Are you aware of any projects using the Wolfram Language (or Mathematica) as an environment to explore the design of programming languages – in particular, languages with a focus on mathematics (optimization, statistics, music/sound generation, arithmetic theories, logic, etc.)?

By design, I mean not so much syntax/parsing as exploring different semantics and what can be done with them, such as theorem proving, machine learning (differentiable programming languages), etc.

I believe that the term-rewriting and symbolic nature of the Wolfram Language, together with its vast amount of curated mathematical functions, could make it quite useful for such tasks. One project for exploring the design of programming languages that is not related to Mathematica is the K semantic framework.

Recently, there seems to have been increased activity to enhance the theorem-proving capabilities of Mathematica. Besides, at least two Mathematica-based theorem-proving frameworks have been developed "in the wild": Theorema and Analytica.

It appears that the Wolfram Language could be a productive environment for such research, yet I cannot find any examples of programming language research based on it.

One Answer

General

Some general answers and comments first.

1

Are you aware of any projects using the Wolfram Language (or Mathematica) as an environment to explore the design of programming languages – in particular, languages with a focus on mathematics (optimization, statistics, music/sound generation, arithmetic theories, logic, etc.)?

For the Wolfram Language (WL) there are three principal ways to answer this question:

  1. Using the standard LISP-style language-development process: a system of functions that addresses the problem domain is developed bottom-up, then edited and refined until it “fits well” the use cases of the problem domain.

    1. WL is a descendant of LISP.

    2. This approach can be seen as “language extension.”

    3. For a more detailed description see the preface and introduction of [PG1].

    4. This is the reason I prefer the new functions I define to look and feel like “part of the system.”

  2. Paraphrasing an observation about LISP -- “WL is a programmable programming language.” Hence, we can reshape its behavior into that of the language we want it to be.

    1. For example, I have implemented a meta-package that facilitates the creation of monadic programming packages, [AA3].

    2. Also, see MSE’s "How and why to use monadic programming in Mathematica?".

  3. Using grammar specification, related generation of parsers, and programming the corresponding interpreters.

    1. WL has built-in functions for grammar specification (e.g. GrammarRules). But those work only in the cloud (so far), and hence are less useful.

    2. Ten years ago I developed a package that provides a system of functional parsers and the generation of parsers from Extended Backus–Naur Form (EBNF) grammars. See [AA1].

    3. For example, see this answer to MSE’s "General strategies to write big code in Mathematica?".
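To make the functional-parser idea from item 3 concrete, here is a minimal sketch in Python (since the technique is language-agnostic): a parser is a function from an input token list to a list of (remaining-input, parsed-value) pairs, and combinators build larger parsers from smaller ones. The names `symbol`, `seq`, and `alt` are illustrative, not the API of [AA1].

```python
def symbol(s):
    """Parser that accepts exactly the token s."""
    return lambda inp: [(inp[1:], s)] if inp and inp[0] == s else []

def seq(p, q):
    """Sequential composition: run p, then run q on p's remainder."""
    return lambda inp: [(rest2, (v1, v2))
                        for rest1, v1 in p(inp)
                        for rest2, v2 in q(rest1)]

def alt(p, q):
    """Alternative composition: all parses of p plus all parses of q."""
    return lambda inp: p(inp) + q(inp)

# EBNF fragment:  greeting = "hello" , ( "world" | "there" ) ;
greeting = seq(symbol("hello"), alt(symbol("world"), symbol("there")))

print(greeting(["hello", "there"]))  # [([], ('hello', 'there'))]
print(greeting(["goodbye"]))         # []
```

Each EBNF construct maps to one combinator, which is why a parser generator from EBNF grammars is a natural next step.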

2

By design, I mean not so much syntax/parsing as exploring different semantics and what can be done with them, such as theorem proving, machine learning (differentiable programming languages), etc.

I have very similar goals, and I address them by programming parsers and interpreters for natural-language-based Domain Specific Languages (DSLs). See for example [AA1, AA2, AAv1, AAv2].

3

I believe that the term-rewriting and symbolic nature of the Wolfram Language and the vast amount of curated mathematical functions could make it quite useful for such tasks.

Of course, WL is easily the best language for that. (At least under some partial order.)

At this point, though, I prefer using Raku, not WL. Raku lets me easily create and manage:

  1. Grammars for multiple natural languages

  2. Interpreters into multiple programming languages

For a fairly advanced example of my approach using Raku, see the WTC-2020 presentation "Multi-language Data-Wrangling Conversational Agent", [AAv3].


Examples

This section has examples that correspond to the numbered items in the first sub-section of the previous section.

Tries (language extension)

I developed a package for creating and manipulating tries (prefix trees) with frequencies, TriesWithFrequencies.m, [AA5]. Below is an example of using its system of functions.

Import["https://raw.githubusercontent.com/antononcube/MathematicaForPrediction/master/TriesWithFrequencies.m"]
words = {"bar", "bars", "barks", "barkeep", "barn", "car", "cars", "caress", "card"};
tr = TrieCreate[Characters /@ words];
opts = {ImageSize -> Medium};
ResourceFunction["GridTableForm"][{{
    TrieForm[tr, opts], 
    TrieForm[TrieShrink@TrieNodeProbabilities@tr, opts] 
   }}, TableHeadings -> {"Original trie with frequencies", "Shrunk trie with probabilities"}]

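To clarify what the trie functions above compute, here is a rough Python sketch of the underlying data structure: each node counts how many words pass through it, and node probabilities are the counts divided by the parent's count. This illustrates the idea only; it is not the implementation in [AA5].

```python
def trie_insert(trie, word):
    """Insert a word, counting how many words pass through each node."""
    node = trie
    node["count"] = node.get("count", 0) + 1
    for ch in word:
        node = node.setdefault(ch, {})
        node["count"] = node.get("count", 0) + 1
    return trie

def node_probabilities(node, parent_count=None):
    """Convert node counts into conditional probabilities, recursively."""
    count = node["count"]
    prob = 1.0 if parent_count is None else count / parent_count
    children = {ch: node_probabilities(child, count)
                for ch, child in node.items() if ch != "count"}
    return {"prob": prob, **children}

trie = {}
for w in ["bar", "bars", "barn", "car"]:
    trie_insert(trie, w)

# 3 of the 4 inserted words start with "b":
print(node_probabilities(trie)["b"]["prob"])  # 0.75
```

TrieShrink in the package additionally merges single-child chains of nodes (e.g. "c"→"a"→"r" into "car"), which is what makes the second displayed trie more compact.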

Epidemiology modeling (monadic pipeline)

Since optimization is mentioned in the question, here is a monadic pipeline for an epidemiological model setup with calibration, [AA4]:

ECMMonUnit[]⟹
  ECMMonSetSingleSiteModel[modelSEI2HR]⟹
  ECMMonAssignInitialConditions[<|TP[0] -> usaPopulation, SP[0] -> usaPopulation - 1, ISSP[0] -> 1|>]⟹
  ECMMonAssignRateRules[KeyDrop[aDefaultPars, {aip, aincp, qsd, ql, qcrf}]]⟹
  ECMMonCalibrate[
   "Target" -> KeyTake[aTargetsShort, {ISSP, DIP}],
   "StockWeights" -> <|ISSP -> 0.8, DIP -> 0.2|>,
   "Parameters" -> <|aip -> {10, 35}, aincp -> {2, 16}, qsd -> {60, 120}, ql -> {20, 160}, qcrf -> {0.1, 0.9}|>,
   DistanceFunction -> EuclideanDistance,
   Method -> {"NelderMead", "PostProcess" -> False},
   MaxIterations -> 1000];

Calibration of epidemiological models means optimizing over a set of prescribed parameters and their ranges, in order to find the model that best fits data gathered "in the field."

The code above only prescribes the optimization process, but note the monadic workflow specification of the model build-up.
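The essence of the ⟹ pipeline above can be sketched in a few lines of Python: each step takes the pipeline's state and returns a new state, or None on failure, and failures short-circuit the remaining steps. The step names below are hypothetical stand-ins, not the ECMMon API.

```python
class Monad:
    """Minimal failure-propagating pipeline, mimicking the ⟹ operator."""
    def __init__(self, state):
        self.state = state

    def bind(self, step, *args):
        # Apply the step to the current state; skip it if a prior step failed.
        if self.state is not None:
            self.state = step(self.state, *args)
        return self

def set_model(state, model):
    return {**state, "model": model}

def assign_initial_conditions(state, conditions):
    return {**state, "init": conditions}

def calibrate(state, target):
    if "model" not in state:
        return None  # cannot calibrate without a model: fail the pipeline
    return {**state, "calibrated_to": target}

result = (Monad({})
          .bind(set_model, "SEI2HR")
          .bind(assign_initial_conditions, {"SP": 329_000_000, "ISSP": 1})
          .bind(calibrate, "ISSP"))

print(result.state["model"])  # SEI2HR
```

The point of the design is that the workflow reads as a declarative sequence of modeling steps, while error handling and state threading are hidden inside `bind` — the same role ⟹ plays in the ECMMon pipeline.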

Clojure expression (functional parsers)

See this answer to MSE’s “How to parse a Clojure expression?”.


References

Articles, books

[AA1] Anton Antonov, Natural language processing with functional parsers, (2014), MathematicaForPrediction at WordPress.

[AA2] Anton Antonov, Creating and programming domain specific languages, (2016), MathematicaForPrediction at WordPress.

[AA3] Anton Antonov, Monad code generation and extension, (2017), MathematicaForPrediction at WordPress.

[AA4] Anton Antonov, Epidemiologic Compartmental Modeling Monad, (2020), MathematicaForPrediction at WordPress.

[AA5] Anton Antonov, Tries with frequencies for data mining, (2013), MathematicaForPrediction at WordPress.

[PG1] Paul Graham, On Lisp, (1993), Prentice Hall, 432 pages, paperback. ISBN 0130305529.

Videos

[AAv1] Anton Antonov, Voice-Grammar-Compute-Communicate: Take Control of Your Health Data, (2018), Wolfram Technology Conference 2018.

[AAv2] Anton Antonov, A Conversational Agent for Neural Networks: Construction, Training and Utilization, (2018), Wolfram Technology Conference 2018.

[AAv3] Anton Antonov, Multi-language Data-Wrangling Conversational Agent, (2020), Wolfram Technology Conference 2020.

Correct answer by Anton Antonov on June 9, 2021
