Python Docstring Processing System The purpose of the Python Docstring Processing System project is to create a standard, modular tool for extracting inline documentation from Python modules and converting it into useful formats, such as HTML, XML, and TeX.
GCC Hacker
I am working on hacking the GCC compiler
Monday, March 04, 2002
Abstract Syntax Tree Optimizations - GNU Project - Free Software Foundation (FSF)
This page describes ongoing work to improve GCC's tree-based optimizations. There is a branch in CVS, called ast-optimizer-branch, which is available for experimenting with these optimizers. As these stabilize, they can be submitted for review onto the mainline development tree. Please contact Nathan Sidwell.
Background & Rationale
GCC, in common with many other compilers, has more than one internal representation of a program. The main ones are trees and RTL. The trees (formally, abstract syntax trees, or ASTs) are generated during parsing and are close to the source language semantics. The RTL is generated in the back end and is close to the generated assembly code. Ideally, the AST would contain all the semantic information of the source program.
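To make the distinction concrete, consider a single statement and a sketch of how the two levels might represent it. The dump forms in the comments below are schematic only, written to suggest the general shape of GCC tree and RTL expressions rather than copied from actual compiler output:

    int example (int a, int b)
    {
        /* The statement we follow through the compiler.  */
        int c = a + b * 4;

        /* Tree (AST) level, schematic: one node per source construct,
           still close to the language semantics.

             (modify_expr c (plus_expr a (mult_expr b 4)))

           RTL level, schematic: a sequence of register-transfer
           instructions with pseudo registers and machine modes,
           already close to assembly.

             (set (reg:SI 60) (mult:SI (reg:SI 59) (const_int 4)))
             (set (reg:SI 61) (plus:SI (reg:SI 58) (reg:SI 60)))  */

        return c;
    }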
Historically, GCC generated RTL one statement at a time, so the AST did not stay around very long. This has changed with 'function at a time' compilation (Inliner), which both the C and C++ front ends now implement. With the AST for complete functions, and the additional semantic information it contains, the opportunity for new optimizations presents itself; a toy illustration follows.
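What a tree-level optimization can look like is easy to sketch with a toy expression tree. The structures below are a stand-alone illustration, not GCC's tree data structures or any pass on the ast-optimizer-branch; they only show the general idea of folding a constant subexpression while the whole tree is still available, before any lower-level code has been generated:

    #include <stdio.h>
    #include <stdlib.h>

    /* A toy expression tree: leaves are variables or constants,
       inner nodes are binary operators.  */
    enum kind { CONST, VAR, PLUS, MULT };

    struct node {
        enum kind kind;
        int value;               /* CONST */
        const char *name;        /* VAR */
        struct node *lhs, *rhs;  /* PLUS, MULT */
    };

    static struct node *mk (enum kind kind, int value, const char *name,
                            struct node *lhs, struct node *rhs)
    {
        struct node *n = malloc (sizeof *n);
        n->kind = kind;
        n->value = value;
        n->name = name;
        n->lhs = lhs;
        n->rhs = rhs;
        return n;
    }

    /* Fold constant subexpressions in place: if both operands of an
       operator are constants, replace the operator node with the
       computed constant.  */
    static struct node *fold (struct node *n)
    {
        if (n->kind == PLUS || n->kind == MULT) {
            n->lhs = fold (n->lhs);
            n->rhs = fold (n->rhs);
            if (n->lhs->kind == CONST && n->rhs->kind == CONST) {
                int v = (n->kind == PLUS)
                        ? n->lhs->value + n->rhs->value
                        : n->lhs->value * n->rhs->value;
                free (n->lhs);
                free (n->rhs);
                n->kind = CONST;
                n->value = v;
                n->lhs = n->rhs = NULL;
            }
        }
        return n;
    }

    /* Print the tree in a Lisp-like prefix form.  */
    static void dump (const struct node *n)
    {
        switch (n->kind) {
        case CONST:
            printf ("%d", n->value);
            break;
        case VAR:
            printf ("%s", n->name);
            break;
        case PLUS:
        case MULT:
            printf (n->kind == PLUS ? "(plus " : "(mult ");
            dump (n->lhs);
            printf (" ");
            dump (n->rhs);
            printf (")");
            break;
        }
    }

    int main (void)
    {
        /* Tree for the source expression  x + 2 * 3  */
        struct node *ast = mk (PLUS, 0, NULL,
                               mk (VAR, 0, "x", NULL, NULL),
                               mk (MULT, 0, NULL,
                                   mk (CONST, 2, NULL, NULL, NULL),
                                   mk (CONST, 3, NULL, NULL, NULL)));
        fold (ast);
        dump (ast);              /* prints: (plus x 6) */
        printf ("\n");
        return 0;
    }

The interesting cases in a real compiler are of course higher-level ones that need the semantic information the AST preserves, but the shape of the work is the same: walk the tree for a whole function and rewrite it.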
The SUIF Compiler - SUIF 2 The SUIF 2 compiler infrastructure project is co-funded by DARPA and NSF. It is a new version of the SUIF compiler system, a free infrastructure designed to support collaborative research in optimizing and parallelizing compilers. It is currently in the beta test stage of development.
Encyclopedia.com - Results for Chomsky, Noam According to transformational grammar, every intelligible sentence conforms not only to grammatical rules peculiar to its particular language, but also to "deep structures," a universal grammar underlying all languages and corresponding to an innate capacity of the human brain. Chomsky and other linguists who have built on his work have formulated transformational rules, which transform a sentence with a given grammatical structure (e.g., "John saw Mary") into a sentence with a different grammatical structure but the same essential meaning ("Mary was seen by John"). Transformational linguistics has been influential in psycholinguistics, particularly in the study of language acquisition by children.
20th WCP: Chomsky and Knowledge of Language ABSTRACT: The linguistic theory of Chomsky has changed the long, traditional way of studying language. The nature of knowledge, which is closely tied to human knowledge in general, makes it a logical step for Chomsky to generalize his theory to the study of the relation between language and the world, in particular the study of truth and reference. But his theory has been controversial, and his proposal of "innate ideas" has been resisted by some empiricists who characterize him as a rationalist. In our view, these empiricists make a mistake. In the present paper we attend to his position regarding linguistics as a science of mind/brain, which we believe is an important aspect of his theory that has not received enough attention or been properly understood by his opponents. In turn, this will help to clarify some of the confusions around his theory. Finally, we will discuss some of the debatable issues based on the outlines we draw.
xrefer - Model In Chomsky's classic transformational model of grammar (Aspects of the Theory of Syntax, 1965), a few syntactic rules in the base of the grammar provided a syntactic deep structure which was then elaborated by processes known as transformations in order to produce a surface structure. Semantics or meaning was dependent on the deep structure, while phonology or sound was dependent on the surface structure. More recently, in Knowledge of Language (1986), Chomsky has envisaged a model of grammar which is less obviously directional, and which, as with a computer program, is composed of a series of modules, each of which is fairly simple in its general workings, but which becomes complex as it interacts with other modules.