Status update on the Software Language Book

My software language book is coming together. :-)

Software Languages: An Introduction
Ralf Lämmel
To appear with Springer Verlag, 2016/17

Call for book usage

If you are giving lectures or are otherwise interested in the emerging book, please get in touch. I can share draft material with you, subject to agreeing on some (simple?) copyright-related protocol. Of course, I would be happy to see this work being used and to receive feedback. If you would rather see a more polished manuscript, then get in touch anyway, and I will be happy to keep you in the loop.

Use cases of the book

Let me just explain how I use the book; please let me know about your needs. 

  • I am already using this book as the exclusive resource for my Bachelor-level course on programming language theory, as it covers enough aspects of formal syntax, semantics, interpretation, and type systems. The style in these areas is quite applied as opposed to theoretical. Nevertheless, some advanced topics are offered in the relevant chapters, e.g., metametamodels, partial evaluation, and abstract interpretation.
  • I am also using this book to support my Master-level course on software language engineering. Especially the DSL primer, the broad coverage of the notion of software languages, and the topics on software metalanguages and software language implementation help me to drill quickly into software language engineering. I should mention that, for the purposes of my research-oriented course, I don’t expect in-depth coverage of those subjects; rather, I want to prepare students for research work.

Important characteristics of the book

I already mentioned that the book is written in an applied as opposed to a theoretical style. The book develops simple notations for abstract and concrete syntax which are used throughout the book. Different programming languages are used to implement functionality (semantics, type checking, transformation, translation, etc.). In the context of semantics and type systems, Haskell is preferred. In some illustrations of language implementation, Python and Java are leveraged. In a few places, Prolog is used for representing rewrite systems or deductive systems to provide executable specifications for problems of software language processing. 

There are many exercises throughout the book; they are classified at the levels "basic", "intermediate", and "advanced". Solutions for the exercises are not included in the book, but those for the basic and intermediate levels should eventually be made available online.

Timeline for finalization

Realistically, I will need all of summer 2016 to finish and mature the content. In August, I plan to provide a version for review by Springer. Depending on the details of the Springer-driven process and my ability to implement change requests in a timely manner, the book could be in print 6 months from now (mid-June 2016). I would be very sad (as if you or I cared) if it took another year. Until then, I am happy to communicate about the possible deployment of the draft material in your courses.

Status of material

These chapters are stable:
  • The notion of software language
    • Software language examples
    • Classification of software languages
    • The lifecycle of software languages
    • Software languages in software engineering
    • Software languages in academia
  • A DSL primer
    • Language concepts
    • Internal DSL style
    • Textual syntax
    • Parsing text to objects
    • Language-agnostic models
    • Language constraints
    • Interpretation
    • Code generation
    • Visual syntax
    • Summary
  • Syntax of software languages
    • Concrete syntax
    • Abstract syntax
    • The metametalevel
    • Syntax mapping
  • Semantics of software languages
    • Pragmatic interpretation
    • Big-step operational semantics
    • Small-step operational semantics
    • Denotational semantics
    • The λ-calculus
    • Partial evaluation
    • Abstract interpretation
  • Types and software languages
    • Type systems
    • Typed calculi
    • Type-system implementation

The following chapters are not yet ready for your eyes; they are released one by one to those interested:
  • Preface (introduction to the book)
  • Postface (summary and outlook)
  • Software metalanguages
    • Attribute grammars
    • Rewrite systems
    • Quasi quotation
    • Templates
    • Intermediate representation
  • Software language implementation
    • Parsing
    • Pretty printing
    • Code generation
    • Test-data generation
    • Syntax-directed editing
  • Megamodeling and software languages
    • Entities of interest
    • Relationships of interest
    • Build management and testing
    • Logic-based axiomatization
    • Megamodels of coupled transformations

What’s not in the book?
  • Descriptions of specific platforms for software language implementation such as Eclipse, EMF, TXL, Stratego/XT, and Rascal. The book aims to be technology-agnostic and somewhat timeless. Also, the assumption is that strong material already exists for all important platforms. The book does come with online material, though, which covers several platforms, and diverse platforms are mentioned in the right places in the book.
  • Proper coverage of compiler construction topics such as flow analysis, intermediate- or machine-code-level optimization, and code generation. A cursory discussion of compiler construction topics is included, though. There are clearly very good books on compiler construction on the market; don’t expect this book to be useful for a compiler construction course.

  • A deeper discussion of current research trends in software language engineering. One might thus miss non-trivial coverage of, for example, language workbenches, projectional editing, language embedding, debugging, language integration, and advanced metalanguages, e.g., for name binding or type systems. The reason for exclusion is again that the book aims at being somewhat timeless. Pointers to research works are included, though.



Responding to reviews of rejected conference papers

This post is concerned with this overall question:

How to make good use of reviews for a rejected conference paper?

The obvious answer is presumably something like this:

Extract TODOs from the reviews. Do your work. Resubmit.

In this post, I'd like to advocate an additional element:

Write a commentary on the reviews.

Why would you respond to reviews for a rejected conference paper?

Here are the reasons I can think of:
  • R1: You received a review that is clearly weak and you want to complain publicly. I recommend against this complaint model. It is unfriendly toward the conference, the chairs, and the reviewers. If one really needs to complain, then one should do so in a friendly manner by direct communication with the conference chairs.
  • R2: You spot factual errors in an otherwise serious review and you want to defend yourself publicly. There is one good reason for doing this: you get it off your chest. There are two good reasons for not doing it. Firstly, chances are that your defense is perceived as an unfriendly complaint; see above. Secondly, why bother and who cares? For instance, sending your defense to the chairs would be weird and useless, I guess.
  • R3: You want to make good use of the reviews during revision and document this properly.

R3 makes sense to me. 

R3 is what this post is about.

We respond to reviews anyway when working on revisions of journal submissions because we have to. One does not make it through a major revision request for a journal paper unless one really makes an effort to properly address the reviewer requests.

Some conferences run a rebuttal model, but this is quite different. Rebuttal is about making reviewers understand the paper; revision of a journal paper is more about making enough presentational improvements, or doing bits of extra thinking and even research, so that the revision is ultimately accepted.

In the case of a rejected conference paper and its revision, I suggest writing the commentary in a style as if the original reviewers were to decide on the revision, even though this will not happen, of course. It remains to be decided on a case-by-case basis whether, how, and when the commentary should be made available to whom and for what purpose.

Not that I want my submissions to be rejected, but it happens because of competition and real issues in a paper or the underlying research. My ICMT 2016 submission was received in a friendly enough manner, but rightly rejected. The paper is now revised, and the paper's website features the ICMT 2016 notification and my commentary on the reviews. In this particular case, I estimated that public access to the notification and my commentary would do more good than harm. At the very least, I can provide a showcase for what I am talking about in this blog post.

With the commentary approach, there are some problems that I can think of:
  • P1: Reviewers or conference chairs feel offended. Without being too paranoid, the reviewers or the chairs could take the commentary as criticism of their work. For instance, the chairs may think that some review was not strong enough to be publicly exposed as a data point of the conference. I have two answers. Firstly, an author should make an effort to avoid explicit or subliminal criticism. (We know how to do this because of how we deal with journal reviews.) Secondly, dear reviewers and chairs, maybe the world would be a better place if more of the review process were transparent?
  • P2: Prior reviews and your commentary could be misused by reviewers. There is some good reason for not exposing reviewers to other reviews of the same paper (or a prior version thereof), at least not until they have cast their votes, because they may get biased or they may adopt those other views without performing a thorough analysis of their own. This is a valid concern. This problem may call for some rules as to i) which conferences are eligible for passing reviews and commentary to each other and ii) when and how commentary can be used by reviewers.
  • P3: Your commentary is perceived as putting pressure on reviewers of the revision. At this stage, I don't propose that reviewers should be required in any way to consider the commentary on a previous version of a paper, because reviewing already takes too much time. All I am saying is that reviewers should be given the opportunity to access previous reviews and the author's commentary, at least at some stage of the review process. Reviewers are welcome to ignore the commentary. In fact, some reviewing models may be hard to reconcile with the notion of commentary. For instance, I don't know whether it would work for the double-blind model.

In summary, commentary on rejected conference submissions is a bit like unit testing. We should do it because it helps us test our interpretation of the reviews in a systematic manner. Without such testing, we are likely to i) complain non-constructively about the reviews; ii) ignore legitimate and critical issues pointed out by the reviews; and iii), as a consequence, resubmit suboptimal revisions and keep program committees busy. So we do not really write this commentary for future reviewers; we rather write it for ourselves. However, we write it in a style such that it could be put to good use by future reviewers.

Once the community gets a bit more used to this idea, we could deal with commentaries in much the same way as with optional appendices at some conferences. One risk is that of bias when reviewers are exposed to previous reviews and author claims in the commentary. Another risk is that a badly implemented process for commentaries would just cause more work for both program committees and authors. Maybe I am thinking a bit too revolutionarily here, but I am looking forward to a system where we break out of the static nature of program committees and allow review results and author responses to be passed on from conference to conference. I am thinking of a more social process of reviewing and revision.



CS intro != Programming intro

Yes, "CS intro != Programming intro". Of course they are not the same; maybe it would never occur to you that they could be!? Well, in some places the CS 101 course ("introduction to CS") ends up being essentially that, at least for some initial part. In some places (Koblenz included; see my OOPM course, which is under attack in this post), the CS 101 course certainly ends up trying to also introduce the basics of programming (because the lecturer thought he/she should).

Here is why I think this is wrong and I apologize for coming forward so late:

  • Teaching the trivial level of programming ("getting you going with for-loops and such" :-)) just doesn't belong at the university. Maybe it never belonged there, but it certainly hasn't belonged there for the last 10-20 years, because it is such a basic skill, and there is so much guidance out there (YouTube etc.) that a contemporary student wanting to study CS must (or should) simply be beyond the point of needing such a trivial introduction.
  • At a "mediocre" university like mine (not exactly overwhelmed with brilliant students), we may get some otherwise smart students who certainly lack the most basic programming skills. Trying to get them up to speed in a university-style course just frustrates everyone. It has certainly bored me nearly to death.
As a result, I am changing my CS 101 course (11 ECTS on paper, which is crazy in itself), effective immediately (per end of October), as follows:
  • I decouple the objective of teaching the basics of programming from other aspects of the course by running two tracks in parallel for some time. I collect those other aspects of the course, which are not about the basics of actual programming, in the "theory track". The "programming track" is just there for those who need it; I expect some students to skip it. The theory track will not make any assumptions about previous programming skills.
  • In the theory track, I spend several weeks overall on algebraic specification with applications to abstract data types and what is maybe best called "algebraically oriented domain modeling". See my recent slide deck (.pdf), in that instance Haskell-focused, though. This is joint work with Magne Haveraaen, University of Bergen. Welcome to the university! Students need to understand algebraic signatures and logic formulae. I am going to be nice to the students :-) So I shall limit coverage to the pragmatic application of loose algebraic specifications; no term algebras, no quotient algebras, no Herbrand universes, no nothing. Structural induction is in, though! Sorry.
  • Once the two tracks have progressed enough, synergy happens: we implement specifications in Java or Python. Appropriate axioms from the algebraic specifications will lend themselves as assertions and as (parametrized) unit tests. We are also well equipped to discuss design by contract. We will also be prepared to appreciate the beauty (?) of UML class diagrams with OCL constraints. Algebraic specifications and logics are everywhere.
  • Logistically, I will be working mostly without slides. Rather, I will use the whiteboard and different editors. The 101wiki of the 101companies project will host code, explanations, and links to Wikipedia and other online resources. I will distribute some lecture notes, but no wall of text is going to replace my previous wall of slides. I want to get across some basic notions (through focused lecture-note material), and for the rest I will use online resources, including commented specifications and programs as well as links to Wikipedia et al.
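To illustrate the intended synergy, here is a hypothetical sketch (not actual course material) of how axioms of an algebraic stack specification can serve as parametrized unit tests against a Python implementation:

```python
# Hypothetical sketch: axioms of an algebraic specification of stacks,
# checked as parametrized unit tests against an implementation.

class Stack:
    """An immutable stack implementation to be tested against the axioms."""
    def __init__(self, items=()):
        self.items = tuple(items)
    def push(self, x):
        return Stack(self.items + (x,))
    def pop(self):
        return Stack(self.items[:-1])
    def top(self):
        return self.items[-1]
    def is_empty(self):
        return not self.items
    def __eq__(self, other):
        return self.items == other.items

# Axioms of the (loose) specification, e.g.:
#   top(push(s, x)) = x
#   pop(push(s, x)) = s
#   is_empty(push(s, x)) = false
def check_axioms(samples):
    """Check every axiom for every sample pair of a stack s and an element x."""
    for s, x in samples:
        assert s.push(x).top() == x
        assert s.push(x).pop() == s
        assert not s.push(x).is_empty()
    return True
```

For instance, `check_axioms([(Stack(), 1), (Stack([1, 2]), 3)])` exercises all axioms for an empty and a non-empty stack; a property-based testing tool could generate such sample pairs automatically.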
This gets us through 2/3 of the course. The remaining 1/3 of the course will use a single, sequential timeline. I will think about it as it gets closer, but here are some thoughts:
  • Coverage of EBNF as another modeling method. Great! There is this wonderful connection between algebraic signatures and context-free grammars. Maybe, I will do term algebras anyway and manage to bore everyone to death.
  • Very basics of complexity theory. This can be done as soon as students are fluent in basic algorithmic programming. The same is true for the very basics of program verification. Coverage of program verification (Hoare logic) will definitely benefit from dealing with logic formulae in the theory track earlier.
  • There is a bunch of not-so-basic programming topics that I tend to cover, such as pointers, encapsulation, interfaces, modularity, inheritance, generics, and exceptions. Some order needs to be determined. I have managed in the past.
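The connection between algebraic signatures and context-free grammars mentioned in the EBNF item above can be made concrete with a small, hypothetical sketch: each operation of a signature induces a grammar production whose right-hand side lists the argument sorts as nonterminals.

```python
# Hypothetical illustration of the signature/grammar correspondence.
# A signature maps each operation to its argument sorts and result sort.
signature = {
    # operation: (argument sorts, result sort)
    'zero': ((), 'Nat'),
    'succ': (('Nat',), 'Nat'),
    'add':  (('Nat', 'Nat'), 'Nat'),
}

def to_grammar(sig):
    """Derive grammar productions: one production per operation,
    grouped by the result sort, which plays the role of a nonterminal."""
    productions = {}
    for op, (args, result) in sig.items():
        productions.setdefault(result, []).append((op, list(args)))
    return productions
```

For the signature above, `to_grammar` yields productions akin to `Nat ::= zero | succ Nat | add Nat Nat`, which is exactly how terms over the signature form a context-free language.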

Luckily, in the second semester, I teach "introduction to functional programming". It is, of course, wonderful (in the view of the lecturer, anyway) to make the students see the connection between algebraic specification and functional programming.


A tiny proof checker in Prolog

The Prolog code for this post is available on GitHub.

I am crazy enough to teach Hoare logic (axiomatic semantics) in the first course for our CS students. For instance, consider the following program (in some simple imperative language), which is supposed to compute quotient q and remainder r according to Euclidean division for operands x and y:

q = 0;
r = x;
while (r >= y) {
   r = r - y;
   q = q + 1;
}

Given natural numbers x and y, the postcondition of the program is this:

x == q*y + r && q >= 0 && r>=0 && r < y

The precondition is this:

x >= 0 && y > 0

We are supposed to prove the "triple" consisting of precondition, program, and postcondition.
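Before proving the triple, it does not hurt to test it. Here is a minimal Python transcription of the program (an illustration, not part of the proof checker), with the precondition and postcondition checked as assertions for sample inputs; this only tests the triple, it does not prove it.

```python
# Sanity check (not a proof): run the program and test the triple.

def euclidean_division(x, y):
    # The imperative program from above, transcribed to Python.
    q = 0
    r = x
    while r >= y:
        r = r - y
        q = q + 1
    return q, r

def check_triple(x, y):
    assert x >= 0 and y > 0                                # precondition
    q, r = euclidean_division(x, y)
    assert x == q * y + r and q >= 0 and r >= 0 and r < y  # postcondition
    return q, r
```

For example, `check_triple(17, 5)` returns `(3, 2)` without any assertion failing.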

Let's face it, verification according to Hoare logic is not easily understood by beginners. Many of my students aren't mathematically or logically fit when they enter the curriculum. So it is a pain in the neck for them to listen to and for me to present. However, I think it is still valuable to expose the students to Hoare logic early on, also in the hope that related topics such as logics and verification show up again later, so that understanding arises through iteration.

So far I have avoided Hoare logic in my course on programming language theory, as I felt that operational and denotational semantics are more presentable in a programming-based fashion. That is, one can obviously write interpreters in both the operational and the denotational style, while the assumption here is that interpreters (i.e., programs) are more easily digested by the average CS student than proofs and logics.

This time around, I am going to add axiomatic semantics to said course. To this end, I am presenting axiomatic semantics mainly through a simple proof checker developed in Prolog. (As reading material, I recommend Chapter 6 of Hanne Riis Nielson, Flemming Nielson: Semantics with Applications: A Formal Introduction, Wiley, 1992, because it is easy enough, excellent overall, and available online.) In this manner, the programmatic path is also exercised for axiomatic semantics. As a neat side effect, this also allows me to talk about concepts such as rewriting, normalization, and simplification, again in a programming-based fashion. (These concepts are close to my heart as a software language engineer and programming language researcher.)

Here is a quick look at the proof checker.

The rules of Hoare logic are (of course) represented as definite clauses. For example:

/* In the case of an assignment, its precondition is calculated from
 * its postcondition by substituting the LHS variable by the RHS
 * expression. */

proof(P, assign(X, E), Q) :-
    subst(Q, X, E, P).

/* In the case of a statement sequence seq(S1, S2), the proof requires
 * subproofs for S1 and S2 subject to an intermediate assertion that
 * serves as postcondition of S1 and precondition of S2. To this end,
 * a statement sequence is represented here as a term with functor
 * seq_/3 (as opposed to the functor seq/2 of the original abstract
 * syntax) so that the extra intermediate assertion can be expressed. */

proof(P, seq_(S1, R, S2), Q) :-
    proof(P, S1, R),
    proof(R, S2, Q).

/* In the case of a while-loop while(E, S), the precondition of the
 * loop is per definition the invariant I of the loop and the
 * postcondition must be equivalent to and(I, not(E)). A subproof is
 * due for the body S of the loop such that the invariant follows as
 * postcondition of S for the precondition and(I, E), i.e., the
 * invariant and the loop's condition hold at the beginning of the
 * execution of the body. */

proof(I, while(E, S), Q) :-
    sameAs(and(I, not(E)), Q),
    proof(and(I, E), S, I).

/* The precondition of a proof can be strengthened. While
 * simplification/normalization of assertions is performed
 * automatically, strengthening must be explicitly requested. (This is
 * well in line with the usual extra rule of Hoare logic for
 * strengthening the precondition.) In the following notation, we
 * assume pseudo syntax to express the precondition before
 * strengthening, whereas the precondition of the triple is the
 * strengthened one. */

proof(P, pre(R, S), Q) :-
    implies(P, R),
    proof(R, S, Q).

  • subst/4 is substitution of variables by expressions within assertions.
  • implies/2 is logical implication.
  • sameAs/2 is equivalence of assertions modulo normalization.
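The code of subst/4 is not shown in this excerpt. As a hypothetical illustration (not the checker's actual code), substitution over assertions represented as nested tuples could look like this in Python:

```python
# Hypothetical illustration of substitution (cf. subst/4): replace every
# occurrence of variable x by expression e within an assertion, where
# assertions and expressions are nested tuples such as
# ('geq', ('var', 'r'), ('number', 0)).

def subst(assertion, x, e):
    if assertion == ('var', x):
        return e                     # the variable itself is replaced
    if isinstance(assertion, tuple):
        # keep the functor, substitute recursively in all arguments
        return (assertion[0],) + tuple(subst(a, x, e) for a in assertion[1:])
    return assertion                 # atoms (names, numbers) are unchanged
```

For instance, substituting q by q+1 in the assertion q >= 0 yields q+1 >= 0, which is just what the assignment rule needs for computing preconditions.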
Some rewrite rules of normalization are included for illustration:

% Reflexivity
rewrite(eq(X, X), true).
rewrite(geq(X, X), true).

% Prefer right-associativity over left-associativity
rewrite(and(and(E1, E2), E3), and(E1, and(E2, E3))).
rewrite(or(or(E1, E2), E3), or(E1, or(E2, E3))).

% Unit laws
rewrite(and(true, E), E).
rewrite(and(E, true), E).
rewrite(or(false, E), E).
rewrite(or(E, false), E).
rewrite(add(number(0), X), X).
rewrite(add(X, number(0)), X).
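The predicate normalize/2 applies such rules exhaustively; its code is not shown here. As a hypothetical illustration of what normalization by exhaustive rewriting amounts to (not the checker's actual code), here is a Python sketch over terms represented as nested tuples:

```python
# Hypothetical sketch of normalization: apply rewrite rules
# innermost-first until no rule applies (a normal form is reached).
# Terms are nested tuples, e.g. ('and', 'true', ('var', 'p')).

def rewrite(term):
    """Apply one top-level rewrite rule, if any; otherwise return None."""
    if isinstance(term, tuple):
        op = term[0]
        if op == 'and' and term[1] == 'true':
            return term[2]              # and(true, E) -> E
        if op == 'and' and term[2] == 'true':
            return term[1]              # and(E, true) -> E
        if op == 'or' and term[1] == 'false':
            return term[2]              # or(false, E) -> E
        if op == 'or' and term[2] == 'false':
            return term[1]              # or(E, false) -> E
        if op in ('eq', 'geq') and term[1] == term[2]:
            return 'true'               # reflexivity
    return None

def normalize(term):
    """Normalize all subterms first, then the term itself, to a fixed point."""
    if isinstance(term, tuple):
        term = (term[0],) + tuple(normalize(t) for t in term[1:])
    reduced = rewrite(term)
    return normalize(reduced) if reduced is not None else term
```

For example, and(eq(x, x), P) normalizes to P: reflexivity rewrites the equation to true, and the unit law then eliminates the conjunction.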

Here is a simple (incomplete) axiomatization of implication:

% Implication complemented by normalization
implies(E1, E2) :-
    normalize(E1, E3),
    normalize(E2, E4),
    implies_(E3, E4).

% Search rules for implication
implies_(E, E).
implies_(_, true).
implies_(and(E1, E2), and(E3, E4)) :-
    implies_(E1, E3),
    implies_(E2, E4).
implies_(and(E1, _), E2) :-
    implies_(E1, E2).
implies_(and(_, E1), E2) :-
    implies_(E1, E2).
implies_(greater(X, number(N1)), greater(X, number(N2))) :-
    N1 > N2.

Here is a simple test predicate for proving correctness of an if-statement:

% Proof for a simple if-statement that computes the max of a and b
prove_if :-
    proof(
      true,
      if(
        greater(var(a), var(b)),
        pre(true, assign(r, var(a))),
        pre(true, assign(r, var(b)))),
      or(eq(var(r), var(a)), eq(var(r), var(b)))).

In abstract syntax, Euclidean division looks like this:

seq(assign(q, number(0)),
    seq(assign(r, var(x)),
        while(geq(var(r), var(y)),
              seq(assign(r, sub(var(r), var(y))),
                  assign(q, add(var(q), number(1)))))))

Here is the proof for Euclidean division; it contains the loop invariant and all other details of the proof.

% Proof for Euclidean division
prove_div :-
    proof(

      % "As declared" precondition of the program
      % x >= 0 && y > 0
      and(
        geq(var(x), number(0)), % As computed
        greater(var(y), number(0))), % Vacuously added for division by zero

      % Strengthen precondition
      pre(

        % Computed (weakest) precondition of the program
        geq(var(x), number(0)),

        % Beginning of program code
        seq_(

          % q = 0;
          assign(q, number(0)),

          % Intermediate assertion
          and(
            eq(var(x), add(mul(var(q), var(y)), var(x))),
            and(
              geq(var(q), number(0)),
              geq(var(x), number(0)))),

          seq_(

            % r = x;
            assign(r, var(x)),

            % Intermediate assertion = invariant for while loop
            and(
              eq(var(x), add(mul(var(q), var(y)), var(r))),
              and(
                geq(var(q), number(0)),
                geq(var(r), number(0)))),

            while(

              % Loop condition
              geq(var(r), var(y)),

              % Strengthen precondition
              pre(

                % Computed precondition of body
                and(
                  eq(var(x), add(mul(add(var(q), number(1)), var(y)), sub(var(r), var(y)))),
                  and(
                    geq(add(var(q), number(1)), number(0)),
                    geq(sub(var(r), var(y)), number(0)))),

                % Loop body
                seq_(

                  % r = r - y;
                  assign(r, sub(var(r), var(y))),

                  % Intermediate assertion
                  and(
                    eq(var(x), add(mul(add(var(q), number(1)), var(y)), var(r))),
                    and(
                      geq(add(var(q), number(1)), number(0)),
                      geq(var(r), number(0)))),

                  % q = q + 1;
                  assign(q, add(var(q), number(1))))))))),

      % "As declared" postcondition of the program
      % x == q*y + r && q >= 0 && r>=0 && r < y
      and(
        eq(var(x), add(mul(var(q), var(y)), var(r))),
        and(
          geq(var(q), number(0)),
          and(
            geq(var(r), number(0)),
            not(geq(var(r), var(y))))))).

It just works:

?- prove_div.


My Gear Fit Wish List

For about 10 days now, I have owned and used a Samsung Gear Fit.

I love it. I don't agree with this negative review.

However, I hope Samsung improves the software in a timely manner.

Here are some proposals; I may edit this blog post as I see fit.

Bug report: Cycling exercise stops recording because of "no movements"

This happens after 10 minutes of recording. I think it happens because the device uses a sanity check that may be right for running exercises, but there are no arm/hand movements if you are cycling in a proper way. Within some time window, I can cancel the stop message to make it continue measuring. Anyway, this is highly annoying, as I need to hit a small on-display button while cycling. I don't see how one could have failed to catch this with basic field testing. [Added on 19 May 2014]

Feature request: Usable wake-up gesture needed

Currently, we have the choice between pushing the small hardware button on the Gear Fit to wake it up and relying on movement-based wakeup, where one lifts the hand as if to look at the display. The first option is clumsy, especially when you are doing sports. The second option is annoying because the display goes on all the time. Simple proposal: provide touch-based wakeup. Alternative: use a more distinctive movement pattern for wakeup. Some sort of shaking would also be conceivable. A combination of movement and touch would be OK, too. [Added on 19 May 2014]

BTW, here is also a forum for the Gear Fit.



Entries from the fake diary of a Zone programmer around Christmas '88

Ralf Lämmel, 17 December 2013, speaking notes for a deadly serious Christmas party


Texts of this kind occasionally generate spiteful reader comments that question how civil servants of this country can possibly spend their time on such nonsense. Let me therefore stress pro forma that I produced this nonsense between unpaid overtime hours on a weekend, grossly neglecting any recommended amount of sleep. I would also like to mention that I, as a born East-zone programmer, ended up in this system only through historical entanglements; its superiority, however, I can read off the sheer quantity and quality of the available bananas alone.

Let us go back to the year 1988, that is, the year before 1989, that is, the year before the year in which the Wall fell, as they say. Actually, the Wall was only carried away completely much later, and until then quite a few pieces of it were still standing; so it was not lying around, nor had it suddenly fallen over, nor had all of it been pushed over. This Wall was not felled or toppled that easily.

It is Christmas 1988.

A certain lance corporal of the reserve, Ralf Lämmel, born in Korl-Morx-Stodt (the city with the three "Os"), raised in Rostock, militarized in Stahnsdorf near Potsdam, and returned to Rostock to study computer science, is in his first semester. For inscrutable reasons, he has recently and allegedly begun to keep a diary. It should be mentioned that while there was no electricity and there were no bananas in the Zone, paper and pencils were available, rationed, to politically conformant citizens. In the following, we gain empirical insights into the East-zone programmer psyche by examining selected diary entries from around Christmas 1988.


Sunday, 18 December 1988

The Meli club yesterday was simply cool. My girlfriend looked good. I also discovered a new drink: a vermouth that goes by the name of Gotano. With that, I renounce "Dozentenblut" and "Bretterknaller". It is such a pity that I cannot also be at the Meli club on Thursdays, but the lecture "Dialectical and Historical Materialism" starts as early as 7:15 on Friday mornings. Apparently there is a homework option for that subject. With it, I could go to the Meli club on Thursdays, too -- just as it is no problem for my girlfriend and the other primary-school pedagogues.

In the synaptically enhanced state of Gotano consumption, XYZ (name redacted) and I decided to develop a research idea in the context of natural language processing. To this end, we first reflected deeply on the just-then-audible lyrics of Marianne Rosenberg's "Er gehört zu mir", which belong to a Meli club evening like the name on the door.

First verse of Marianne Rosenberg's "Er gehört zu mir":

He belongs to me, like my name on the door,
and I know he will stay here,
never will I forget our first day,
Naaa naa naa na, na na na
for I felt right away that he likes me,
Naaa naa naa na, na na na
is it true love (uuuhhhuuuhhuuu)
that never fades (uhuuuhuu)
or will this love be gone with the wind?

We defined the following research questions, to be addressed by an analysis based on natural language processing and corpus studies, also drawing on socialist songs. In any case, Prolog would be used in this project.


  1. Are there characteristic elements in these lyrics that expose this song as a spawn of the class enemy? Or is this rather an undecidable problem, or a problem that can only be solved by so-called "strong Artificial Intelligence"?
  2. Which subliminal elements lead to the mysterious mass psychosis in the socialist student club, such that politically educated students and other socialist personalities turn to this imperialist body of thought?
  3. Might Marianne Rosenberg perhaps be a covert revolutionary after all?
  4. Do the lyrics provide evidence for the imperialistically established domestication of men?

Ultimately, towards the end of the evening, that is, in the early morning, the song led us to general reflections on our nascent computer science studies and on life as such:

Further questions:

  1. Why do we have to program in Pascal and C? These languages are quite obviously unsuited for forming a clear thought. Can socialist society succeed in the long run if it favors such degenerate (imperative, hence imperialist) languages?
  2. How do we explain to our girlfriends what we think? If need be, are there sufficiently good-looking female computer science students, and sufficiently many of them? (Note added later in the margin of the diary: further empirical investigations suggest that female mathematicians should also be admitted into the search space.)

In the aftermath of these considerations, I have just sent a letter to the district leadership of the SED to have the political integrity of our professors examined with respect to the system-hostile choice of programming languages in the curriculum.

Friday, 23 December 1988

I was tasked with the big Christmas shopping. My mother has already snagged the duck, so only staple foods remained to be procured. Here is the receipt:

  • 100 bread rolls at 0.05 Mark each, 5.00 Mark in total
  • 7 Leckermäulchen at 1.00 Mark each, 7.00 Mark in total
  • 1 kg coffee = 8 packs at 8.75 Mark each, 70.00 Mark in total
  • 10 bottles of Club-Cola at 0.42 Mark each, 4.20 Mark in total
  • 1 Schlager-Süßtafel at 0.80 Mark
  • etc.

(See further price examples from the GDR.)
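For what it's worth, the per-line totals and the overall sum of the receipt check out; here is a tiny Python sketch redoing the arithmetic (the "etc." items are necessarily omitted):

```python
# Receipt items from the diary entry: (name, quantity, unit price in Mark)
receipt = [
    ("Brötchen", 100, 0.05),
    ("Leckermäulchen", 7, 1.00),
    ("Kaffee (Packung)", 8, 8.75),
    ("Club-Cola", 10, 0.42),
    ("Schlager-Süßtafel", 1, 0.80),
]

for name, qty, price in receipt:
    print(f"{qty} x {name} @ {price:.2f} Mark = {qty * price:.2f} Mark")

# Total of the listed items; the receipt's "etc." lines are not included.
total = sum(qty * price for _, qty, price in receipt)
print(f"Total: {total:.2f} Mark")
```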

No sooner am I home than we spot someone down on the street carrying oranges. Accordingly, I set off for the Kaufhalle again. Such speed pays off. They sold me 1 kg of oranges; that was the limit per person.

Finally, I was allowed to rush to the department store to buy a few Jahresendflügelfiguren ("year-end winged figures") as decoration for the year-end tree, as well as a few year-end dolls without wings but with cap and beard, and made of chocolate -- that is, intended for consumption.

Monday, 26.12.1988

This is a particularly terrible Monday: computing center closed, girlfriend at her parents', alternatives out of reach with the Meli-Club closed, Polish duck too dry, oranges inedible.

The Cuban oranges looked deliciously green in the Kaufhalle, but the jostling, the queueing, and all that money were not worth it; the oranges are full of seeds and utterly sour. Perhaps there is something to the rumor that the Cuban oranges are Fidel Castro's revenge.

Politically, too, I feel a little abandoned. How nice it still was when, serving with the colors in the company unit, we had the occasional "Red Monday" that mobilized us for the fatherland. In this situation of total desolation, I inadvertently lose myself in Western television. But it is full of lies. They make it look as if there were delicious oranges and bananas over there. The truth looks different, as we know from the NVA bible "Vom Sinn des Soldatseins" (page 35 in the version available to me):

“Opposed to the vital interests of the peoples stands the rule of imperialism. To its account go [...] poverty, chronic malnutrition, and high infant mortality. History has long since passed its verdict on this historically obsolete social order doomed to perish.”

This shows quite clearly that there can be no oranges, no bananas, or any other vitamin-rich food over there. I thank Marx and Lenin for having baked us an alternative. Here we always have delicious apples and carrots. Socialism has defeated scurvy.

Friday, 30.12.1988

It is quite an impressive coincidence that the current homework in "Einführung in die Programmierung" (Introduction to Programming) is about the queue, while New Year's Eve is around the corner and with it the annual ritual of acquiring firecrackers. On Wednesday I dutifully joined the queue around 11 pm. That was too late! The firecracker quota was exhausted 42 minutes after the shop opened, that is, at exactly 9:42 am. At that point I could not even see the shop door.
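The homework's queue (and the firecracker queue, for that matter) is a plain first-in, first-out buffer. A minimal Python sketch, with names and quota made up by me:

```python
from collections import deque

# A queue is first-in, first-out: whoever joins first is served first.
queue = deque()

# People join the queue in order of arrival.
for person in ["Anne", "Bernd", "Clara"]:
    queue.append(person)        # enqueue at the back

# The shop serves the front of the queue until the quota is exhausted.
quota = 2
while quota > 0 and queue:
    served = queue.popleft()    # dequeue at the front
    print(f"{served} gets firecrackers")
    quota -= 1

print(f"Left empty-handed: {list(queue)}")
```

Whoever joins too late stays in the queue when the quota runs out -- exactly the situation described above.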

Still, the time in the queue was not entirely wasted. One topic in my segment of the queue was the precarious housing question. It still seems to be established practice that one either lives with one's mother until about the age of 30, or gets married and has a child in order to be assigned a flat.

However, this seems about to improve. A friendly party comrade came up with a quote from Erich Honecker's autobiography "Aus meinem Leben" (page 304 in the version available to me):

“In October 1973, our Central Committee adopted a housing construction program in order to solve the housing question as a social problem in the GDR by 1990.”

As soon as my girlfriend is back in town, I will demonstrate to her my activity diagram for solving our personal housing question. The diagram contains a fork/join to maximize our options for obtaining a flat. In one of the parallel flows, we trust Erich's promise and would thus have a flat by 31.12.1990 at the latest, without any further physical effort. In parallel, out of healthy distrust, we could also bring a child into the world and thereby perhaps be assigned a flat as early as autumn 1989. The diagram is executable in Prolog.
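The fork/join logic is simple enough to sketch. The diary claims a Prolog rendering; what follows is merely my rough Python equivalent, with the dates taken from the entry (the exact autumn day is my guess):

```python
from datetime import date

# Two parallel ways of obtaining a flat (dates as given in the diary entry).
# Branch 1: trust Erich's promise -- a flat by the end of 1990 at the latest.
# Branch 2: have a child -- possibly a flat as early as autumn 1989.
branches = {
    "trust_erichs_promise": date(1990, 12, 31),
    "have_a_child": date(1989, 9, 1),  # "autumn 1989"; exact day is my guess
}

# Fork: both branches run in parallel.
# Join: we are done as soon as the FIRST branch yields a flat.
winner = min(branches, key=branches.get)
print(f"First flat via: {winner} on {branches[winner]}")
```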

It remains to be hoped that those concrete blocks of professors will not push their C-like afflictions for solving such problems after all. Not all professors are that reactionary, though. Prof. XYZ (name redacted) recently told us about the language Hope, and I have never felt so much hope. The functional programming language Hope can express higher-order functions well, it can describe case distinctions over data patterns, and it is strongly typed. I will nudge my relatives in the West to get me more information about it. Hope shall soon belong to me like the door handle to the door.


If you would like to read something serious about the history of computer science in the GDR, the article "Integration der Informatik-Standorte der DDR in den Fakultätentag" is recommended.


Install Mac OS X Mavericks from USB

It's really easy.

Why would you do it?

  1. Do it on many machines without downloading Mavericks time and again.
  2. Do it on a machine that is not working properly in terms of its Mac OS X install.
  3. Do it on a machine where you want to flatten the old Mac OS X.
  4. Do it on a machine that refuses to install in the normal way, so that you would like to flatten it.
For instance, I had a 10.6 machine where the installer would fail because the existing partition was found to be non-usable. Apparently, this is a common problem also covered by the Apple KB. My partition was a standard one, as far as I can tell.

How to prepare a USB install drive?

A somewhat verbose story is given elsewhere.

If you know sudo, then it is trivial:
  1. Format the USB stick with Disk Utility; call it "Untitled", say.
  2. Just download Mavericks (as if you wanted to install it).
  3. Run the following sudo command that prepares the USB drive in a few minutes.
sudo /Applications/Install\ OS\ X\ Mavericks.app/Contents/Resources/createinstallmedia --volume /Volumes/Untitled --applicationpath /Applications/Install\ OS\ X\ Mavericks.app --nointeraction

So basically, the installer ships with this great command-line tool for creating an install drive!
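A small aside on the command above: the backslashes merely escape the spaces in the application name; double quotes work just as well. A quick shell sanity check (nothing Mavericks-specific about it):

```shell
# Both spellings denote the same path; backslash-escaping and
# double-quoting are equivalent ways of handling the spaces.
p1=/Applications/Install\ OS\ X\ Mavericks.app
p2="/Applications/Install OS X Mavericks.app"
[ "$p1" = "$p2" ] && echo "same path"
```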

How to install from the USB drive?

  1. Insert the USB drive into the computer.
  2. Boot and press "option" to see the list of boot devices.
  3. Select the USB drive.
This would launch the installer.

How to reset and install?

If you want to run disk utilities (in order to flatten the previous system), press the "R" key right after having selected the USB drive. This would start the recovery partition on the USB drive, which gives you access to the disk utilities. In this manner, you can, for example, format the old drive and thus prepare for a fresh install of Mavericks leaving the past behind.