I'm struck by the typography in these proceedings: it ranges from LaTeX to typewritten papers. This year featured many fundamental advances and introduced several now-basic techniques, such as methods for texture filtering and anti-aliasing, and hardware implementations of polygon rasterizers. Many of the techniques are applicable to GPUs today.
Modeling Waves and Surf, Darwyn R. Peachey, pp. 65-74. An early approach to rendering waves and their interaction with a beach. Innovations included using particle systems to model spray when waves break. The techniques presented in this paper would translate to a lightweight GPU shader.
A Simple Model of Ocean Waves, Alain Fournier, William T. Reeves, pp. 75-84. This model is still frequently referenced today. It is a frequency-based model that represents the surface of the ocean as a series of stationary elliptical orbits. This model is highly applicable to the GPU.
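A minimal sketch of the orbit idea, as my own illustration rather than Fournier and Reeves' full model: each surface point travels around a stationary orbit, and summing such waves across frequencies builds up the sea surface. All parameter names here are mine.

```python
import math

def gerstner_point(x0, t, amplitude=0.5, wavelength=8.0, speed=2.0):
    """Displace a rest position x0 along a stationary elliptical orbit.

    Returns the (x, y) position of the surface point at time t.
    Parameter names and defaults are illustrative, not from the paper.
    """
    k = 2.0 * math.pi / wavelength       # wavenumber
    phase = k * (x0 - speed * t)         # phase travels with the wave
    x = x0 + amplitude * math.sin(phase) # horizontal component of the orbit
    y = -amplitude * math.cos(phase)     # vertical component of the orbit
    return x, y
```

In a vertex shader this becomes a per-vertex displacement, which is why the model maps so directly onto the GPU.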
Combining Physical and Visual Simulation - Creation of the Planet Jupiter for the Film "2010", Larry Yaeger, Craig Upson, Robert Myers, pp. 85-93. A vorticity map is used to advect a texture map representing the atmosphere of Jupiter. This is an early but often overlooked use of computer graphics in film.
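The advection step can be sketched generically. This is a plain semi-Lagrangian backtrace of a texture through a velocity field, not necessarily the exact scheme Yaeger and Upson used, and the function and parameter names are mine.

```python
import numpy as np

def advect_texture(tex, vel_x, vel_y, dt=1.0):
    """Semi-Lagrangian advection of a 2-D texture by a velocity field.

    tex, vel_x, vel_y: 2-D arrays of the same shape. Each pixel is traced
    backwards along the velocity field and resampled from its source.
    """
    h, w = tex.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # trace backwards: where did the fluid at (x, y) come from?
    src_x = np.clip(xs - dt * vel_x, 0, w - 1)
    src_y = np.clip(ys - dt * vel_y, 0, h - 1)
    # nearest-neighbour resample (bilinear would look smoother)
    return tex[src_y.round().astype(int), src_x.round().astype(int)]
```

Iterating this step with a velocity field derived from a vorticity map drags the cloud texture into swirling bands.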
A Fast Shaded-Polygon Renderer, Roger W. Swanson, Larry J. Thayer, pp. 95-102. I find this article a handy reference when implementing software rasterizers for polygons.
Atmospheric Illumination and Shadows, Nelson L. Max, pp. 117-124. The shadow volume algorithm of Frank Crow is turned inside out to produce god rays.
Continuous Tone Representation of Three-Dimensional Objects Illuminated by Sky Light, Tomoyuki Nishita, Eihachiro Nakamae, pp. 125-132. An illumination model for sky light is developed, adjustable for such effects as direct illumination, overcast skies, and time of day. Good results are achieved with only single-bounce global effects.
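The hemisphere integral behind any sky-light model can be sketched with a brute-force Riemann sum. The paper's actual technique is far more efficient than this; `sky_radiance` and the sample counts below are my own illustration.

```python
import math

def sky_irradiance(normal, sky_radiance, n_theta=16, n_phi=32):
    """Numerically integrate sky radiance over the hemisphere above a surface.

    sky_radiance(direction) returns the radiance arriving from a unit sky
    direction. This is a generic midpoint-rule sketch, not the paper's method.
    """
    total = 0.0
    for i in range(n_theta):
        theta = (i + 0.5) * (math.pi / 2) / n_theta       # zenith angle
        for j in range(n_phi):
            phi = (j + 0.5) * 2 * math.pi / n_phi          # azimuth
            d = (math.sin(theta) * math.cos(phi),
                 math.sin(theta) * math.sin(phi),
                 math.cos(theta))
            cos_n = sum(a * b for a, b in zip(d, normal))
            if cos_n > 0:
                # radiance * cosine foreshortening * solid-angle element
                total += sky_radiance(d) * cos_n * math.sin(theta)
    return total * (math.pi / 2 / n_theta) * (2 * math.pi / n_phi)
```

For a uniform sky of radiance 1 over an upward-facing surface this converges to pi, the classic sanity check for hemisphere integrators.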
The Rendering Equation, James T. Kajiya, pp. 143-150. The classic paper introducing the lighting model that forms the basis of so many rendering systems, including RenderMan. The formulation states that the transport intensity of light from one surface point to another is simply the sum of the emitted light and the total light intensity scattered toward that point from all other surface points.
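In Kajiya's point-to-point notation, that statement reads:

```latex
I(x, x') = g(x, x') \left[ \epsilon(x, x') + \int_S \rho(x, x', x'')\, I(x', x'')\, dx'' \right]
```

where I(x, x') is the transport intensity of light from x' to x, g(x, x') is a geometry/visibility term, epsilon is the emitted light, and rho is the scattering term, with the integral taken over all surface points x''.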
Free-Form Deformation of Solid Geometric Models, Thomas W. Sederberg, Scott R. Parry, pp. 151-159. This paper introduced the notion of a deformation lattice fitted around a model to provide an easy-to-use manipulator for mesh modification.
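The lattice evaluation is a trivariate Bezier in the paper's Bernstein-polynomial formulation. A minimal sketch, with the lattice layout and names chosen by me:

```python
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def ffd(point, lattice):
    """Free-form deformation of a point with local coords (s, t, u) in [0,1]^3.

    lattice[i][j][k] is a (possibly displaced) control point (x, y, z).
    The deformed position is the Bernstein-weighted sum of control points.
    """
    s, t, u = point
    l = len(lattice) - 1
    m = len(lattice[0]) - 1
    n = len(lattice[0][0]) - 1
    out = [0.0, 0.0, 0.0]
    for i in range(l + 1):
        for j in range(m + 1):
            for k in range(n + 1):
                w = bernstein(l, i, s) * bernstein(m, j, t) * bernstein(n, k, u)
                for a in range(3):
                    out[a] += w * lattice[i][j][k][a]
    return tuple(out)
```

With an undisplaced lattice the mapping is the identity; moving control points bends every embedded model point smoothly, which is what makes the lattice such a friendly manipulator.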
Automatic Conversion of Curvilinear Wire-Frame Models to Surface Boundary Models: A Topological Approach, S. Mark Courter, John A. Brewer III, pp. 171-178. This paper presents an algorithm for deriving polygon surfaces given a set of edges connected at vertices. This paper is a very useful reference. Having implemented this algorithm once, I have to say that it is much more complicated than I expected, particularly when trying to derive topological features such as holes or tunnels.
Hairy Brushes, Steve Strassmann, pp. 225-232. A model for natural-media paint brushes takes spline segments and renders naturalistic strokes along them. A clear presentation of this concept. This paper is frequently referenced by people implementing natural media painting systems.
Snap Dragging, Eric Allan Bier, Maureen C. Stone, pp. 233-240. A significant early description of constrained snapping for user interfaces.
Filtering by Repeated Integration, Paul S. Heckbert, pp. 315-321. Heckbert generalizes Crow's summed-area table method of texture filtering by integrating a kernel a few times, then sampling the result, as an alternative to complete convolution. This is due to an original idea by Ken Perlin.
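The single-integration, 1-D case (the 1-D analogue of a summed-area table) can be sketched as follows; the function names are mine.

```python
from itertools import accumulate

def box_filter(samples, radius):
    """Box-filter a 1-D signal via one integration plus a two-tap difference.

    A single prefix-sum integration followed by sampling a difference is the
    n = 1 case of Heckbert's repeated-integration scheme: the filtered value
    at each point costs two lookups regardless of the filter width.
    """
    # integrate once: prefix[i] = sum of samples[0..i-1]
    prefix = [0] + list(accumulate(samples))
    n = len(samples)
    out = []
    for i in range(n):
        lo = max(0, i - radius)
        hi = min(n, i + radius + 1)
        # differencing the integral yields the windowed average
        out.append((prefix[hi] - prefix[lo]) / (hi - lo))
    return out
```

Integrating twice and taking a three-tap difference gives a triangle filter, and so on: each extra integration raises the order of the reconstructed kernel at constant per-sample cost.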