diff --git a/README.md b/README.md index 06fcfd4..167191f 100644 --- a/README.md +++ b/README.md @@ -1,373 +1,153 @@ ------------------------------------------------------------------------------- -CIS565: Project 5: WebGL +CIS 565 : Project 5 : WebGL ------------------------------------------------------------------------------- -Fall 2013 -------------------------------------------------------------------------------- -Due Friday 11/08/2013 -------------------------------------------------------------------------------- - -------------------------------------------------------------------------------- -NOTE: -------------------------------------------------------------------------------- -This project requires any graphics card with support for a modern OpenGL -pipeline. Any AMD, NVIDIA, or Intel card from the past few years should work -fine, and every machine in the SIG Lab and Moore 100 is capable of running -this project. - -This project also requires a WebGL capable browser. The project is known to -have issues with Chrome on windows, but Firefox seems to run it fine. - -------------------------------------------------------------------------------- -INTRODUCTION: -------------------------------------------------------------------------------- -In this project, you will get introduced to the world of GLSL in two parts: -vertex shading and fragment shading. The first part of this project is the -Image Processor, and the second part of this project is a Wave Vertex Shader. - -In the first part of this project, you will implement a GLSL vertex shader as -part of a WebGL demo. You will create a dynamic wave animation using code that -runs entirely on the GPU. - -In the second part of this project, you will implement a GLSL fragment shader -to render an interactive globe in WebGL. This will include texture blending, -bump mapping, specular masking, and adding a cloud layer to give your globe a -uniquie feel. - -------------------------------------------------------------------------------- -CONTENTS: -------------------------------------------------------------------------------- -The Project4 root directory contains the following subdirectories: - -* part1/ contains the base code for the Wave Vertex Shader. -* part2/ contains the base code for the Globe Fragment Shader. -* resources/ contains the screenshots found in this readme file. +#Overview -------------------------------------------------------------------------------- -PART 1 REQUIREMENTS: -------------------------------------------------------------------------------- +In this project, we aim to work with WebGL to write vertex and fragment shaders. +(Click picture to view demo.) 
+ +##Part 1 -In Part 1, you are given code for: +###Requirements: +* Sine Wave Vertex Shader [![Sine Wave Vertex Shader](resources/sine_wave.png)](http://harmoli.github.io/Project5-WebGL/vert_wave.html) +* Simplex Wave Vertex Shader [![Simplex Wave Vertex Shader](resources/simplex_1D.png)](http://harmoli.github.io/Project5-WebGL/simplex.html) +* 2D Simplex Wave Vertex Shader [![Simplex Wave Vertex Shader](resources/simplex_2D2.png)](http://harmoli.github.io/Project5-WebGL/simplex2D.html) -* Drawing a VBO through WebGL -* Javascript code for interfacing with WebGL -* Functions for generating simplex noise +###Extra: +* Custom Simplex Wave Vertex Shader [![Simplex Wave Vertex Shader](resources/custom_wave2.png)](http://harmoli.github.io/Project5-WebGL/wave.html) +* (in development) Audio Driven Vertex Shader -You are required to implement the following: - -* A sin-wave based vertex shader: - -![Example sin wave grid](resources/sinWaveGrid.png) - -* A simplex noise based vertex shader: - -![Example simplex noise wave grid](resources/oceanWave.png) - -* One interesting vertex shader of your choice - -------------------------------------------------------------------------------- -PART 1 WALKTHROUGH: -------------------------------------------------------------------------------- -**Sin Wave** - -* For this assignment, you will need the latest version of Firefox. -* Begin by opening index.html. You should see a flat grid of black and white - lines on the xy plane: - -![Example boring grid](resources/emptyGrid.png) - -* In this assignment, you will animate the grid in a wave-like pattern using a - vertex shader, and determine each vertex’s color based on its height, as seen - in the example in the requirements. -* The vertex and fragment shader are located in script tags in `index.html`. -* The JavaScript code that needs to be modified is located in `index.js`. -* Required shader code modifications: - * Add a float uniform named u_time. - * Modify the vertex’s height using the following code: - - ```glsl - float s_contrib = sin(position.x*2.0*3.14159 + u_time); - float t_contrib = cos(position.y*2.0*3.14159 + u_time); - float height = s_contrib*t_contrib; - ``` - - * Use the GLSL mix function to blend together two colors of your choice based - on the vertex’s height. The lowest possible height should be assigned one - color (for example, `vec3(1.0, 0.2, 0.0)`) and the maximum height should be - another (`vec3(0.0, 0.8, 1.0)`). Use a varying variable to pass the color to - the fragment shader, where you will assign it `gl_FragColor`. - -* Required JavaScript code modifications: - * A floating-point time value should be increased every animation step. - Hint: the delta should be less than one. - * To pass the time to the vertex shader as a uniform, first query the location - of `u_time` using `context.getUniformLocation` in `initializeShader()`. - Then, the uniform’s value can be set by calling `context.uniform1f` in - `animate()`. - -**Simplex Wave** - -* Now that you have the sin wave working, create a new copy of `index.html`. - Call it `index_simplex.html`, or something similar. -* Open up `simplex.vert`, which contains a compact GLSL simplex noise - implementation, in a text editor. Copy and paste the functions included - inside into your `index_simplex.html`'s vertex shader. 
-* Try changing s_contrib and t_contrib to use simplex noise instead of sin/cos
-  functions with the following code:
-
-```glsl
-vec2 simplexVec = vec2(u_time, position);
-float s_contrib = snoise(simplexVec);
-float t_contrib = snoise(vec2(s_contrib,u_time));
-```
-
-**Wave Of Your Choice**
-
-* Create another copy of `index.html`. Call it `index_custom.html`, or
-  something similar.
-* Implement your own interesting vertex shader! In your README.md with your
-  submission, describe your custom vertex shader, what it does, and how it
-  works.
-
-------------------------------------------------------------------------------
-PART 2 REQUIREMENTS:
-------------------------------------------------------------------------------
-In Part 2, you are given code for:
-
-* Reading and loading textures
-* Rendering a sphere with textures mapped on
-* Basic passthrough fragment and vertex shaders
-* A basic globe with Earth terrain color mapping
-* Gamma correcting textures
-* javascript to interact with the mouse
-  * left-click and drag moves the camera around
-  * right-click and drag moves the camera in and out
-
-You are required to implement:
+##Part 2
+###Requirements:
 * Bump mapped terrain
-* Rim lighting to simulate atmosphere
-* Night-time lights on the dark side of the globe
-* Specular mapping
-* Moving clouds
-
-You are also required to pick one open-ended effect to implement:
+* Rendering the globe with day and night textures
+* Smooth interpolation between day and night textures
+* Specular mapping for water
+* Rim lighting (post-process)
+* Moving Clouds
+###Extras:
 * Procedural water rendering and animation using noise
-* Shade based on altitude using the height map
-* Cloud shadows via ray-tracing through the cloud map in the fragment shader
-* Orbiting Moon with texture mapping and shadow casting onto Earth
-* Draw a skybox around the entire scene for the stars.
-* Your choice! Email Liam and Patrick to get approval first
-
-Finally in addition to your readme, you must also set up a gh-pages branch
-(explained below) to expose your beautiful WebGL globe to the world.
-Some examples of what your completed globe renderer will look like:
+[![Globe](resources/frag_globe.png)](http://harmoli.github.io/Project5-WebGL/index.html)
-![Completed globe, day side](resources/globe_day.png)
+-----------
-Figure 0. Completed globe renderer, daylight side.
+#Discussion
-![Completed globe, twilight](resources/globe_twilight.png)
+##Part 1
-Figure 1. Completed globe renderer, twilight border.
+####2D Simplex Wave Vertex Shader
-![Completed globe, night side](resources/globe_night.png)
+Taking the simplex wave that propagated along one axis, we decided it would be
+interesting to see a simplex noise function that propagates in 2D. In this
+version, we seed two different components of simplex noise: one using the
+position's x component and the other using the position's y component. On its
+own this produced a very box-like pattern, so we added a final smoothing term
+that takes both noise components into account, using the product of the two
+previous simplex values, together with time, as its seed.
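+
+A minimal sketch of the idea (the `snoise`, `u_time`, and `position` names come
+from the base code; the constants and weights here are illustrative, not
+necessarily the exact ones used in our shader):
+
+```glsl
+// one noise component per axis, each seeded against time
+float s_contrib = snoise(vec2(position.x, u_time));
+float t_contrib = snoise(vec2(position.y, u_time));
+// smoothing term: the product of the two components, re-seeded against time
+float smooth_contrib = snoise(vec2(s_contrib * t_contrib, u_time));
+float height = 0.5 * s_contrib * t_contrib + 0.5 * smooth_contrib;
+```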
-Figure 2. Completed globe renderer, night side.
+####Custom Simplex Wave Vertex Shader
-------------------------------------------------------------------------------
-PART 2 WALKTHROUGH:
-------------------------------------------------------------------------------
-
-Open part2/frag_globe.html in Firefox to run it. You’ll see a globe
-with Phong lighting like the one in Figure 3. All changes you need to make
-will be in the fragment shader portion of this file.
-
-![Initial globe](resources/globe_initial.png)
-
-Figure 3. Initial globe with diffuse and specular lighting.
-
-**Night Lights**
-
-The backside of the globe not facing the sun is completely black in the
-initial globe. Use the `diffuse` lighting component to detect if a fragment
-is on this side of the globe, and, if so, shade it with the color from the
-night light texture, `u_Night`. Do not abruptly switch from day to night;
-instead use the `GLSL mix` function to smoothly transition from day to night
-over a reasonable period. The resulting globe will look like Figure 4.
-Consider brightening the night lights by multiplying the value by two.
-
-The base code shows an example of how to gamma correct the nighttime texture:
+The custom vertex shader we have written takes a simple sine and cosine wave
+that decays the farther the position is from (0,0). Thus, we get interesting
+oscillating waves that seem to propagate out from (0,0) (or it just looks like
+an odd-looking stingray, whichever you please).
-```glsl
-float gammaCorrect = 1/1.2;
-vec4 nightColor = pow(texture2D(u_Night, v_Texcoord), vec4(gammaCorrect));
-```
+Interestingly enough, we had originally tried to do this from the center of the
+mesh; however, when trying to compute the distance from the position of the
+vertex to (50, 50) (the center of the mesh), we ran into trouble: the mesh
+would either not move at all or would oscillate as a parallel plane.
-Feel free to play with gamma correcting the night and day textures if you
-wish. Find values that you think look nice!
+####Audio Driven Vertex Shader
-![Day/Night without specular mapping](resources/globe_nospecmap.png)
+With the introduction of HTML5, there has been a large push for audio standards
+on the web. Currently, the standard is the Web Audio API. The API provides ways
+to load local assets via AJAX/XMLHttpRequest, to play and loop files, and to
+analyze (e.g. via a fast Fourier transform) and filter audio.
-Figure 4. Globe with night lights and day/night blending at dusk/dawn.
+Mostly out of curiosity, we are trying to create a simple 3D audio visualizer
+by taking the transformed audio data from the page and using it to drive the
+vertex shader. Libraries such as ThreeAudio.js already do this.
+Currently, the biggest hurdle is getting the transformed data out of the
+analyser node and hooking it up to the shader. As we would like to keep data
+from previous samples around, we believe that a ring-buffer-like structure,
+continuously filled while the read offset advances, will benefit the shader.
-**Specular Map**
+##Part 2
-Our day/night color still shows specular highlights on landmasses, which
-should only be diffuse lit. Only the ocean should receive specular highlights.
-Use `u_EarthSpec` to determine if a fragment is on ocean or land, and only
-include the specular component if it is in ocean.
+#####Procedural water rendering and animation using noise
-![Day/Night with specular mapping](resources/globe_specmap.png)
+In order to generate and animate "waves" in the water, we perturb the normal
+of the water on the globe and use it as a modified bump map normal. All of this
+is done in the fragment shader. First, we use the specular map to differentiate
+between water and land. Since virtually none of the water has detail in the
+given bump map, we know that the x and y components of the normal computed
+from the right, above and center texels will be 0.
+Since we are primarily concerned with perturbing the texel-read normal in one
+direction (to make it look as if there is a difference in height), we use
+simplex noise to perturb the texel-read normal in the x direction. To animate
+the water, we add the u_time term so that the oscillation of the simplex noise
+varies over time. A rough sketch of this idea is shown below.
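+
+The sketch assumes the base code's 1024x512 bump map for the texel offsets and
+a GLSL `snoise` like the one used in Part 1 being available in the fragment
+shader; the noise seeding and constants are illustrative only:
+
+```glsl
+float ocean  = texture2D(u_EarthSpec, v_Texcoord).r;  // assumed ~1.0 on water
+float center = texture2D(u_Bump, v_Texcoord).r;
+float right  = texture2D(u_Bump, v_Texcoord + vec2(1.0/1024.0, 0.0)).r;
+float top    = texture2D(u_Bump, v_Texcoord + vec2(0.0, 1.0/512.0)).r;
+vec3 perturbedNormal = vec3(center - right, center - top, 0.2);
+if (ocean > 0.5) {
+    // water: fake height differences by pushing the normal's x component
+    // around with time-varying simplex noise
+    perturbedNormal.x += snoise(vec2(v_Texcoord.s * 100.0, u_time));
+}
+perturbedNormal = normalize(perturbedNormal);
+```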
-Figure 5. Globe with specular map. Compare to Figure 4. Here, the specular
-component is not used when shading the land.
+-----------
-**Clouds**
+#Performance Analysis
-In day time, clouds should be diffuse lit. Use `u_Cloud` to determine the
-cloud color, and `u_CloudTrans` and `mix` to determine how much a daytime
-fragment is affected by the day diffuse map or cloud color. See Figure 6.
+The following performance numbers were taken using the Stats.js widget. Note
+that this is not the most accurate way to measure WebGL performance: the widget
+gives approximate numbers based on the call time of the animate function, which
+is called once per frame.
-In night time, clouds should obscure city lights. Use `u_CloudTrans` and `mix`
-to blend between the city lights and solid black. See Figure 7.
+###Integrated Graphics Card
+Program | FPS | ms per frame
+----| ----- | -----
+Globe | 32 FPS | 32 ms
+Sine Wave | 37 FPS | 27 ms
+Custom Wave | 36 FPS | 28 ms
+Simplex Wave | 38 FPS | 24 ms
+Simplex 2D | 38 FPS | 27 ms
-Animate the clouds by offseting the `s` component of `v_Texcoord` by `u_time`
-when reading `u_Cloud` and `u_CloudTrans`.
+###GPU Enabled
+Program | FPS | ms per frame
+---- | ---- | ----
+Globe | 60 FPS | 17 ms
+Sine Wave | 60 FPS | 17 ms
+Custom Wave | 60 FPS | 17 ms
+Simplex Wave | 60 FPS | 17 ms
+Simplex 2D | 60 FPS | 17 ms
-![Day with clouds](resources/globe_daycloud.png)
+NOTE: All of these numbers were taken from the GitHub-hosted versions over the
+same internet connection.
-Figure 6. Clouds with day time shading.
+-----------
-![Night with clouds](resources/globe_nightcloud.png)
+#Acknowledgements
-Figure 7. Clouds observing city nights on the dark side of the globe.
+Much of the audio visualizer code is based on the following tutorials and
+discussions of current uses of the Web Audio API:
-**Bump Mapping**
+[Web Audio API Analysis and Visualisation](http://chimera.labs.oreilly.com/books/1234000001552/ch05.html)
-Add the appearance of mountains by perturbing the normal used for diffuse
-lighting the ground (not the clouds) by using the bump map texture, `u_Bump`.
+[A Web Audio Spectrum Analyzer](http://0xfe.blogspot.com/2011/08/web-audio-spectrum-analyzer.html)
-This texture is 1024x512, and is zero when the fragment is at sea-level, and
-one when the fragment is on the highest mountain. Read three texels from this
-texture: once using `v_Texcoord`; once one texel to the right; and once one
-texel above. Create a perturbed normal in tangent space:
+[WebGL + WebAudio = Fun](http://wemadeyoulook.at/en/blog/webgl-webaudio-api-fun/)
-`normalize(vec3(center - right, center - top, 0.2))`
-Use `eastNorthUpToEyeCoordinates` to transform this normal to eye coordinates,
-normalize it, then use it for diffuse lighting the ground instead of the
-original normal.
-
-![Globe with bump mapping](resources/globe_bumpmap.png)
-
-Figure 8. Bump mapping brings attention to mountains.
-
-**Rim Lighting**
-
-Rim lighting is a simple post-processed lighting effect we can apply to make
-the globe look as if it has an atmospheric layer catching light from the sun.
-Implementing rim lighting is simple; we being by finding the dot product of
-`v_Normal` and `v_Position`, and add 1 to the dot product. We call this value
-our rim factor. If the rim factor is greater than 0, then we add a blue color
-based on the rim factor to the current fragment color. You might use a color
-something like `vec4(rim/4, rim/2, rim/2, 1)`. If our rim factor is not greater
-than 0, then we leave the fragment color as is. Figures 0,1 and 2 show our
-finished globe with rim lighting.
-
-For more information on rim lighting,
-read http://www.fundza.com/rman_shaders/surface/fake_rim/fake_rim1.html.
-
-------------------------------------------------------------------------------
-GH-PAGES
-------------------------------------------------------------------------------
-Since this assignment is in WebGL you will make your project easily viewable by
-taking advantage of GitHub's project pages feature.
+-----------
-Once you are done you will need to create a new branch named gh-pages:
+#External Libraries
-`git branch gh-pages`
+[Stats.js](https://github.com/mrdoob/stats.js)
-Switch to your new branch:
+[Web Audio API](https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html)
-`git checkout gh-pages`
+-----------
-Create an index.html file that is either your renamed frag_globe.html or
-contains a link to it, commit, and then push as usual. Now you can go to
+#Miscellaneous
-`.github.io/`
+##GPU Specs
+NVIDIA GeForce 650M
-to see your beautiful globe from anywhere.
-
-------------------------------------------------------------------------------
-README
-------------------------------------------------------------------------------
-All students must replace or augment the contents of this Readme.md in a clear
-manner with the following:
-
-* A brief description of the project and the specific features you implemented.
-* At least one screenshot of your project running.
-* A 30 second or longer video of your project running. To create the video you
-  can use http://www.microsoft.com/expression/products/Encoder4_Overview.aspx
-* A performance evaluation (described in detail below).
-
-------------------------------------------------------------------------------
-PERFORMANCE EVALUATION
-------------------------------------------------------------------------------
-The performance evaluation is where you will investigate how to make your
-program more efficient using the skills you've learned in class. You must have
-performed at least one experiment on your code to investigate the positive or
-negative effects on performance.
-
-We encourage you to get creative with your tweaks. Consider places in your code
-that could be considered bottlenecks and try to improve them.
-
-Each student should provide no more than a one page summary of their
-optimizations along with tables and or graphs to visually explain any
-performance differences.
-
-------------------------------------------------------------------------------
-THIRD PARTY CODE POLICY
-------------------------------------------------------------------------------
-* Use of any third-party code must be approved by asking on the Google groups.
-  If it is approved, all students are welcome to use it. Generally, we approve
-  use of third-party code that is not a core part of the project. For example,
-  for the ray tracer, we would approve using a third-party library for loading
-  models, but would not approve copying and pasting a CUDA function for doing
-  refraction.
-* Third-party code must be credited in README.md. -* Using third-party code without its approval, including using another - student's code, is an academic integrity violation, and will result in you - receiving an F for the semester. - -------------------------------------------------------------------------------- -SELF-GRADING -------------------------------------------------------------------------------- -* On the submission date, email your grade, on a scale of 0 to 100, to Liam, - liamboone@gmail.com, with a one paragraph explanation. Be concise and - realistic. Recall that we reserve 30 points as a sanity check to adjust your - grade. Your actual grade will be (0.7 * your grade) + (0.3 * our grade). We - hope to only use this in extreme cases when your grade does not realistically - reflect your work - it is either too high or too low. In most cases, we plan - to give you the exact grade you suggest. -* Projects are not weighted evenly, e.g., Project 0 doesn't count as much as - the path tracer. We will determine the weighting at the end of the semester - based on the size of each project. - - ---- -SUBMISSION ---- -As with the previous project, you should fork this project and work inside of -your fork. Upon completion, commit your finished project back to your fork, and -make a pull request to the master repository. You should include a README.md -file in the root directory detailing the following - -* A brief description of the project and specific features you implemented -* At least one screenshot of your project running. -* A link to a video of your project running. -* Instructions for building and running your project if they differ from the - base code. -* A performance writeup as detailed above. -* A list of all third-party code used. -* This Readme file edited as described above in the README section. +##Integrated Card Specs +Intel HD Graphics 4000 diff --git a/part1/09 Guardian Angel.ogg b/part1/09 Guardian Angel.ogg new file mode 100644 index 0000000..468de51 Binary files /dev/null and b/part1/09 Guardian Angel.ogg differ diff --git a/part1/13 My Favorite Things.mp3 b/part1/13 My Favorite Things.mp3 new file mode 100644 index 0000000..6f969ee Binary files /dev/null and b/part1/13 My Favorite Things.mp3 differ diff --git a/part1/13 My Favorite Things.ogg b/part1/13 My Favorite Things.ogg new file mode 100644 index 0000000..e98a7cf Binary files /dev/null and b/part1/13 My Favorite Things.ogg differ diff --git a/part1/audio_visualizer.js b/part1/audio_visualizer.js new file mode 100644 index 0000000..48c33d9 --- /dev/null +++ b/part1/audio_visualizer.js @@ -0,0 +1,232 @@ +// Based on Karl Gustavsgavan's implementation of an audio visualizer. 
+// http://wemadeyoulook.at/en/blog/webgl-webaudio-api-fun/ +// +// Audio Visualizer + +(function() { + "use strict"; + /*global window,document,Float32Array,Uint16Array,mat4,vec3,snoise*/ + /*global getShaderSource,createWebGLContext,createProgram*/ + + var NUM_WIDTH_PTS = 128; + var NUM_HEIGHT_PTS = 128; + + var message = document.getElementById("message"); + var canvas = document.getElementById("canvas"); + var context = createWebGLContext(canvas, message); + if (!context) { + return; + } + + /////////////////////////////////////////////////////////////////////////// + + context.viewport(0, 0, canvas.width, canvas.height); + context.clearColor(1.0, 1.0, 1.0, 1.0); + context.enable(context.DEPTH_TEST); + + var persp = mat4.create(); + mat4.perspective(45.0, 0.5, 0.1, 100.0, persp); + + var eye = [2.0, 1.0, 3.0]; + var center = [0.0, 0.0, 0.0]; + var up = [0.0, 0.0, 1.0]; + var view = mat4.create(); + mat4.lookAt(eye, center, up, view); + + var positionLocation = 0; + var heightLocation = 1; + var u_modelViewPerspectiveLocation; + var u_timeLocation; + var u_time = 0; + + (function initializeShader() { + var program; + var vs = getShaderSource(document.getElementById("vs")); + var fs = getShaderSource(document.getElementById("fs")); + + var program = createProgram(context, vs, fs, message); + context.bindAttribLocation(program, positionLocation, "position"); + u_modelViewPerspectiveLocation = context.getUniformLocation(program,"u_modelViewPerspective"); + u_timeLocation = context.getUniformLocation(program, "u_time"); + + context.useProgram(program); + })(); + + var heights; + var numberOfIndices; + + (function initializeGrid() { + function uploadMesh(positions, heights, indices) { + // Positions + var positionsName = context.createBuffer(); + context.bindBuffer(context.ARRAY_BUFFER, positionsName); + context.bufferData(context.ARRAY_BUFFER, positions, context.STATIC_DRAW); + context.vertexAttribPointer(positionLocation, 2, context.FLOAT, false, 0, 0); + context.enableVertexAttribArray(positionLocation); + + if (heights) + { + // Heights + var heightsName = context.createBuffer(); + context.bindBuffer(context.ARRAY_BUFFER, heightsName); + context.bufferData(context.ARRAY_BUFFER, heights.length * heights.BYTES_PER_ELEMENT, context.STREAM_DRAW); + context.vertexAttribPointer(heightLocation, 1, context.FLOAT, false, 0, 0); + context.enableVertexAttribArray(heightLocation); + } + + // Indices + var indicesName = context.createBuffer(); + context.bindBuffer(context.ELEMENT_ARRAY_BUFFER, indicesName); + context.bufferData(context.ELEMENT_ARRAY_BUFFER, indices, context.STATIC_DRAW); + } + + var WIDTH_DIVISIONS = NUM_WIDTH_PTS - 1; + var HEIGHT_DIVISIONS = NUM_HEIGHT_PTS - 1; + + var numberOfPositions = NUM_WIDTH_PTS * NUM_HEIGHT_PTS; + + var positions = new Float32Array(2 * numberOfPositions); + var indices = new Uint16Array(2 * ((NUM_HEIGHT_PTS * (NUM_WIDTH_PTS - 1)) + (NUM_WIDTH_PTS * (NUM_HEIGHT_PTS - 1)))); + + var positionsIndex = 0; + var indicesIndex = 0; + var length; + + for (var j = 0; j < NUM_WIDTH_PTS; ++j) + { + positions[positionsIndex++] = j /(NUM_WIDTH_PTS - 1); + positions[positionsIndex++] = 0.0; + + if (j>=1) + { + length = positionsIndex / 2; + indices[indicesIndex++] = length - 2; + indices[indicesIndex++] = length - 1; + } + } + + for (var i = 0; i < HEIGHT_DIVISIONS; ++i) + { + var v = (i + 1) / (NUM_HEIGHT_PTS - 1); + positions[positionsIndex++] = 0.0; + positions[positionsIndex++] = v; + + length = (positionsIndex / 2); + indices[indicesIndex++] = length - 1; + 
indices[indicesIndex++] = length - 1 - NUM_WIDTH_PTS; + + for (var k = 0; k < WIDTH_DIVISIONS; ++k) + { + positions[positionsIndex++] = (k + 1) / (NUM_WIDTH_PTS - 1); + positions[positionsIndex++] = v; + + length = positionsIndex / 2; + var new_pt = length - 1; + indices[indicesIndex++] = new_pt - 1; // Previous side + indices[indicesIndex++] = new_pt; + + indices[indicesIndex++] = new_pt - NUM_WIDTH_PTS; // Previous bottom + indices[indicesIndex++] = new_pt; + } + } + + uploadMesh(positions, heights, indices); + numberOfIndices = indices.length; + })(); + + + var aud_context; + var aud_buffer; + var aud_fft; + var aud_filter; + var aud_loaded; + var src; + var counter = 0; + + (function initAudio(){ + try{ + console.log("initAudio()"); + aud_context = new AudioContext(); + loadAudio(); + }catch(e){ + alert("Browser does not support Web Audio API"); + } + })(); + + + function loadAudio(){ + var req = new XMLHttpRequest(); + req.open("GET", "13 My Favorite Things.ogg", true); + req.responseType = "arraybuffer"; + + src = aud_context.createBufferSource(); + src.connect(aud_context.destination); + req.onload = function(){ + aud_context.decodeAudioData(req.response, function onSuccess(aud_buffer){ + src.buffer = aud_buffer; + play(); + }, function onFailure(){ + alert("Decoding failed"); + }); + }; + req.send(); + } + + function play(){ + // Fast Fourier Transform + aud_fft = aud_context.createAnalyser(); + aud_fft.fftSize = 256; + + aud_filter = aud_context.createBiquadFilter(); + aud_filter.type = 0; + aud_filter.frequency.value = 220; + + src.connect(aud_filter); + aud_filter.connect(aud_fft); + aud_fft.connect(aud_context.destination); + + src.loop = true; + src.start(0); + aud_loaded = true; + //document.getElementById('loading').innerHTML=''; + } + + (function animate(){ + /////////////////////////////////////////////////////////////////////////// + // Update + + u_time = u_time + .005; + + var freqDomain = new Float32Array(aud_fft.frequencyBinCount); + aud_fft.getFloatFrequencyData(freqDomain); + for (var i = 0; i < aud_fft.frequencyBinCount; i++){ + var value = freqDomain[i]; + var percent = value / 256; + } + + var model = mat4.create(); + mat4.identity(model); + mat4.translate(model, [-0.5, -0.5, 0.0]); + var mv = mat4.create(); + mat4.multiply(view, model, mv); + var mvp = mat4.create(); + mat4.multiply(persp, mv, mvp); + + /////////////////////////////////////////////////////////////////////////// + // Render + context.clear(context.COLOR_BUFFER_BIT | context.DEPTH_BUFFER_BIT); + + context.uniformMatrix4fv(u_modelViewPerspectiveLocation, false, mvp); + context.uniform1f(u_timeLocation, u_time); + context.drawElements(context.LINES, numberOfIndices, context.UNSIGNED_SHORT,0); + + counter = counter + 1; + + function(callback){ + window.setTimeout(callback, 1000/60); + } + + window.requestAnimFrame(animate); + })(); +}()); + diff --git a/part1/index_audio.html b/part1/index_audio.html new file mode 100644 index 0000000..c02464d --- /dev/null +++ b/part1/index_audio.html @@ -0,0 +1,50 @@ + + + Audio Visualizer + + + + +
+ + + + + + + + + + + + + + diff --git a/part1/index_simplex.html b/part1/index_simplex.html new file mode 100644 index 0000000..13dd1ca --- /dev/null +++ b/part1/index_simplex.html @@ -0,0 +1,39 @@ +vec3 permute(vec3 x) { + x = ((x*34.0)+1.0)*x; + return x - floor(x * (1.0 / 289.0)) * 289.0; +} + +float simplexNoise(vec2 v) + { + const vec4 C = vec4(0.211324865405187, 0.366025403784439, -0.577350269189626, 0.024390243902439); + + vec2 i = floor(v + dot(v, C.yy) ); + vec2 x0 = v - i + dot(i, C.xx); + + vec2 i1; + i1 = (x0.x > x0.y) ? vec2(1.0, 0.0) : vec2(0.0, 1.0); + + vec4 x12 = x0.xyxy + C.xxzz; + x12.xy -= i1; + + i = i - floor(i * (1.0 / 289.0)) * 289.0; + + vec3 p = permute( permute( i.y + vec3(0.0, i1.y, 1.0 )) + + i.x + vec3(0.0, i1.x, 1.0 )); + + vec3 m = max(0.5 - vec3(dot(x0,x0), dot(x12.xy,x12.xy), dot(x12.zw,x12.zw)), 0.0); + m = m*m ; + m = m*m ; + + vec3 x = 2.0 * fract(p * C.www) - 1.0; + vec3 h = abs(x) - 0.5; + vec3 ox = floor(x + 0.5); + vec3 a0 = x - ox; + + m *= inversesqrt( a0*a0 + h*h ); + + vec3 g; + g.x = a0.x * x0.x + h.x * x0.y; + g.yz = a0.yz * x12.xz + h.yz * x12.yw; + return 130.0 * dot(m, g); +} \ No newline at end of file diff --git a/part1/simplex.html b/part1/simplex.html new file mode 100644 index 0000000..ab80180 --- /dev/null +++ b/part1/simplex.html @@ -0,0 +1,90 @@ + + + +Simplex + + + + + +
+ + + + + + + + + + + + diff --git a/part1/vert_wave.html b/part1/vert_wave.html index 57107ca..a9c8645 100644 --- a/part1/vert_wave.html +++ b/part1/vert_wave.html @@ -12,22 +12,32 @@ diff --git a/part1/vert_wave.js b/part1/vert_wave.js index b90b9cf..d22539a 100644 --- a/part1/vert_wave.js +++ b/part1/vert_wave.js @@ -31,6 +31,8 @@ var positionLocation = 0; var heightLocation = 1; var u_modelViewPerspectiveLocation; + var u_timeLocation; + var u_time = 0; (function initializeShader() { var program; @@ -40,6 +42,7 @@ var program = createProgram(context, vs, fs, message); context.bindAttribLocation(program, positionLocation, "position"); u_modelViewPerspectiveLocation = context.getUniformLocation(program,"u_modelViewPerspective"); + u_timeLocation = context.getUniformLocation(program, "u_time"); context.useProgram(program); })(); @@ -130,6 +133,8 @@ /////////////////////////////////////////////////////////////////////////// // Update + u_time = u_time + .005; + var model = mat4.create(); mat4.identity(model); mat4.translate(model, [-0.5, -0.5, 0.0]); @@ -143,6 +148,7 @@ context.clear(context.COLOR_BUFFER_BIT | context.DEPTH_BUFFER_BIT); context.uniformMatrix4fv(u_modelViewPerspectiveLocation, false, mvp); + context.uniform1f(u_timeLocation, u_time); context.drawElements(context.LINES, numberOfIndices, context.UNSIGNED_SHORT,0); window.requestAnimFrame(animate); diff --git a/part1/wave.html b/part1/wave.html new file mode 100644 index 0000000..1b3bc7a --- /dev/null +++ b/part1/wave.html @@ -0,0 +1,48 @@ + + + + `:w + Wave + + + + + +
+ + + + + + + + + + + + diff --git a/part1/wave.js b/part1/wave.js new file mode 100644 index 0000000..c4d7c38 --- /dev/null +++ b/part1/wave.js @@ -0,0 +1,157 @@ +(function() { + "use strict"; + /*global window,document,Float32Array,Uint16Array,mat4,vec3,snoise*/ + /*global getShaderSource,createWebGLContext,createProgram*/ + + var NUM_WIDTH_PTS = 100; + var NUM_HEIGHT_PTS = 100; + + var message = document.getElementById("message"); + var canvas = document.getElementById("canvas"); + var context = createWebGLContext(canvas, message); + if (!context) { + return; + } + + /////////////////////////////////////////////////////////////////////////// + + context.viewport(0, 0, canvas.width, canvas.height); + context.clearColor(1.0, 1.0, 1.0, 1.0); + context.enable(context.DEPTH_TEST); + + var persp = mat4.create(); + mat4.perspective(45.0, 0.5, 0.1, 100.0, persp); + + var eye = [2.0, 1.0, 3.0]; + var center = [0.0, 0.0, 0.0]; + var up = [0.0, 0.0, 1.0]; + var view = mat4.create(); + mat4.lookAt(eye, center, up, view); + + var positionLocation = 0; + var heightLocation = 1; + var u_modelViewPerspectiveLocation; + var u_timeLocation; + var u_time = 0; + + (function initializeShader() { + var program; + var vs = getShaderSource(document.getElementById("vs")); + var fs = getShaderSource(document.getElementById("fs")); + + var program = createProgram(context, vs, fs, message); + context.bindAttribLocation(program, positionLocation, "position"); + u_modelViewPerspectiveLocation = context.getUniformLocation(program,"u_modelViewPerspective"); + u_timeLocation = context.getUniformLocation(program, "u_time"); + + context.useProgram(program); + })(); + + var heights; + var numberOfIndices; + + (function initializeGrid() { + function uploadMesh(positions, heights, indices) { + // Positions + var positionsName = context.createBuffer(); + context.bindBuffer(context.ARRAY_BUFFER, positionsName); + context.bufferData(context.ARRAY_BUFFER, positions, context.STATIC_DRAW); + context.vertexAttribPointer(positionLocation, 2, context.FLOAT, false, 0, 0); + context.enableVertexAttribArray(positionLocation); + + if (heights) + { + // Heights + var heightsName = context.createBuffer(); + context.bindBuffer(context.ARRAY_BUFFER, heightsName); + context.bufferData(context.ARRAY_BUFFER, heights.length * heights.BYTES_PER_ELEMENT, context.STREAM_DRAW); + context.vertexAttribPointer(heightLocation, 1, context.FLOAT, false, 0, 0); + context.enableVertexAttribArray(heightLocation); + } + + // Indices + var indicesName = context.createBuffer(); + context.bindBuffer(context.ELEMENT_ARRAY_BUFFER, indicesName); + context.bufferData(context.ELEMENT_ARRAY_BUFFER, indices, context.STATIC_DRAW); + } + + var WIDTH_DIVISIONS = NUM_WIDTH_PTS - 1; + var HEIGHT_DIVISIONS = NUM_HEIGHT_PTS - 1; + + var numberOfPositions = NUM_WIDTH_PTS * NUM_HEIGHT_PTS; + + var positions = new Float32Array(2 * numberOfPositions); + var indices = new Uint16Array(2 * ((NUM_HEIGHT_PTS * (NUM_WIDTH_PTS - 1)) + (NUM_WIDTH_PTS * (NUM_HEIGHT_PTS - 1)))); + + var positionsIndex = 0; + var indicesIndex = 0; + var length; + + for (var j = 0; j < NUM_WIDTH_PTS; ++j) + { + positions[positionsIndex++] = j /(NUM_WIDTH_PTS - 1); + positions[positionsIndex++] = 0.0; + + if (j>=1) + { + length = positionsIndex / 2; + indices[indicesIndex++] = length - 2; + indices[indicesIndex++] = length - 1; + } + } + + for (var i = 0; i < HEIGHT_DIVISIONS; ++i) + { + var v = (i + 1) / (NUM_HEIGHT_PTS - 1); + positions[positionsIndex++] = 0.0; + positions[positionsIndex++] = v; + + length 
= (positionsIndex / 2); + indices[indicesIndex++] = length - 1; + indices[indicesIndex++] = length - 1 - NUM_WIDTH_PTS; + + for (var k = 0; k < WIDTH_DIVISIONS; ++k) + { + positions[positionsIndex++] = (k + 1) / (NUM_WIDTH_PTS - 1); + positions[positionsIndex++] = v; + + length = positionsIndex / 2; + var new_pt = length - 1; + indices[indicesIndex++] = new_pt - 1; // Previous side + indices[indicesIndex++] = new_pt; + + indices[indicesIndex++] = new_pt - NUM_WIDTH_PTS; // Previous bottom + indices[indicesIndex++] = new_pt; + } + } + + uploadMesh(positions, heights, indices); + numberOfIndices = indices.length; + })(); + + (function animate(){ + /////////////////////////////////////////////////////////////////////////// + // Update + + u_time = u_time + .001; + + var model = mat4.create(); + mat4.identity(model); + mat4.translate(model, [-0.5, -0.5, 0.0]); + var mv = mat4.create(); + mat4.multiply(view, model, mv); + var mvp = mat4.create(); + mat4.multiply(persp, mv, mvp); + + /////////////////////////////////////////////////////////////////////////// + // Render + context.clear(context.COLOR_BUFFER_BIT | context.DEPTH_BUFFER_BIT); + + context.uniformMatrix4fv(u_modelViewPerspectiveLocation, false, mvp); + context.uniform1f(u_timeLocation, u_time); + context.drawElements(context.LINES, numberOfIndices, context.UNSIGNED_SHORT,0); + + window.requestAnimFrame(animate); + })(); + +}()); diff --git a/part2/frag_globe.html b/part2/frag_globe.html index 6aa5609..2ad964f 100644 --- a/part2/frag_globe.html +++ b/part2/frag_globe.html @@ -9,7 +9,6 @@
- + + diff --git a/part2/frag_globe.js b/part2/frag_globe.js index 1d8a877..5262464 100644 --- a/part2/frag_globe.js +++ b/part2/frag_globe.js @@ -14,6 +14,16 @@ var NUM_WIDTH_PTS = 64; var NUM_HEIGHT_PTS = 64; + var stats = new Stats(); + var stats_ms = new Stats(); + stats_ms.setMode(1); + stats.domElement.style.position = 'absolute'; + stats.domElement.style.top = '10px'; + document.body.appendChild( stats.domElement ); + stats_ms.domElement.style.position = 'absolute'; + stats_ms.domElement.style.top = '60px'; + document.body.appendChild( stats_ms.domElement ); + var message = document.getElementById("message"); var canvas = document.getElementById("canvas"); var gl = createWebGLContext(canvas, message); @@ -56,6 +66,8 @@ var u_BumpLocation; var u_timeLocation; + var u_time = 0; + (function initializeShader() { var vs = getShaderSource(document.getElementById("vs")); var fs = getShaderSource(document.getElementById("fs")); @@ -265,8 +277,8 @@ gl.uniformMatrix4fv(u_ViewLocation, false, view); gl.uniformMatrix4fv(u_PerspLocation, false, persp); gl.uniformMatrix4fv(u_InvTransLocation, false, invTrans); - gl.uniform3fv(u_CameraSpaceDirLightLocation, lightdir); + gl.uniform1f(u_timeLocation, .07 * time); gl.activeTexture(gl.TEXTURE0); gl.bindTexture(gl.TEXTURE_2D, dayTex); @@ -290,6 +302,8 @@ time += 0.001; window.requestAnimFrame(animate); + stats.update(); + stats_ms.update(); } var textureCount = 0; diff --git a/part2/noise3D.js b/part2/noise3D.js new file mode 100644 index 0000000..d519f47 --- /dev/null +++ b/part2/noise3D.js @@ -0,0 +1,316 @@ +(function(exports) { + "use strict"; + /*global window,vec3*/ + + exports = exports || window; + + function step(edge, x) { + return [ + (x[0] < edge[0]) ? 0.0 : 1.0, + (x[1] < edge[1]) ? 0.0 : 1.0, + (x[2] < edge[2]) ? 0.0 : 1.0 + ]; + } + + function step_vec4(edge, x) { + return [ + (x[0] < edge[0]) ? 0.0 : 1.0, + (x[1] < edge[1]) ? 0.0 : 1.0, + (x[2] < edge[2]) ? 0.0 : 1.0, + (x[3] < edge[3]) ? 0.0 : 1.0 + ]; + } + + function min(x, y) { + return [ + y[0] < x[0] ? y[0] : x[0], + y[1] < x[1] ? y[1] : x[1], + y[2] < x[2] ? y[2] : x[2] + ]; + } + + function max(x, y) { + return [ + y[0] > x[0] ? y[0] : x[0], + y[1] > x[1] ? y[1] : x[1], + y[2] > x[2] ? y[2] : x[2] + ]; + } + + function max_vec4(x, y) { + return [ + y[0] > x[0] ? y[0] : x[0], + y[1] > x[1] ? y[1] : x[1], + y[2] > x[2] ? y[2] : x[2], + y[3] > x[3] ? y[3] : x[3] + ]; + } + + function vec4_dot(left, right) { + return left[0] * right[0] + + left[1] * right[1] + + left[2] * right[2] + + left[3] * right[3]; + } + + // + // Description : Array and textureless GLSL 2D/3D/4D simplex + // noise functions. + // Author : Ian McEwan, Ashima Arts. + // Maintainer : ijm + // Lastmod : 20110822 (ijm) + // License : Copyright (C) 2011 Ashima Arts. All rights reserved. + // Distributed under the MIT License. See LICENSE file. 
+ // https://github.com/ashima/webgl-noise + // + function mod289_vec3(x) { + var temp = (1.0 / 289.0); + return [ + x[0] - Math.floor(x[0] * temp) * 289.0, + x[1] - Math.floor(x[1] * temp) * 289.0, + x[2] - Math.floor(x[2] * temp) * 289.0 + ]; + } + + function mod289_vec4(x) { + var temp = (1.0 / 289.0); + return [ + x[0] - Math.floor(x[0] * temp) * 289.0, + x[1] - Math.floor(x[1] * temp) * 289.0, + x[2] - Math.floor(x[2] * temp) * 289.0, + x[3] - Math.floor(x[3] * temp) * 289.0 + ]; + } + + function permute_vec4(x) { + return mod289_vec4([ + ((x[0]*34.0)+1.0)*x[0], + ((x[1]*34.0)+1.0)*x[1], + ((x[2]*34.0)+1.0)*x[2], + ((x[3]*34.0)+1.0)*x[3] + ]); + } + + function taylorInvSqrt_vec4(r) { + return [ + 1.79284291400159 - 0.85373472095314 * r[0], + 1.79284291400159 - 0.85373472095314 * r[1], + 1.79284291400159 - 0.85373472095314 * r[2], + 1.79284291400159 - 0.85373472095314 * r[3] + ]; + } + + exports.snoise = function(v) { + // const vec2 C = vec2(1.0f/6.0f, 1.0f/3.0f) ; + // const vec4 D = vec4(0.0f, 0.5f, 1.0f, 2.0f); + var C = [1.0/6.0, 1.0/3.0]; + var D = [0.0, 0.5, 1.0, 2.0]; + + // vec3 i = floor(v + dot(v, vec3(C.y, C.y, C.y)) ); + // vec3 x0 = v - i + dot(i, vec3(C.x, C.x, C.x) ); + var temp0 = vec3.create(); + var temp3 = vec3.dot(v, [C[1], C[1], C[1]]); + vec3.add(v, [temp3, temp3, temp3], temp0); + var i = [Math.floor(temp0[0]), Math.floor(temp0[1]), Math.floor(temp0[2])]; + var temp1 = vec3.create(); + vec3.subtract(v, i, temp1); + var temp2 = vec3.dot(i, [C[0], C[0], C[0]]); + var x0 = vec3.create(); + vec3.add(temp1, [temp2, temp2, temp2], x0); + + // vec3 g = step(vec3(x0.y, x0.z, x0.x), vec3(x0.x, x0.y, x0.z)); + // vec3 l = 1.0f - g; + // vec3 i1 = min( vec3(g.x, g.y, g.z), vec3(l.z, l.x, l.y) ); + // vec3 i2 = max( vec3(g.x, g.y, g.z), vec3(l.z, l.x, l.y) ); + var g = step([x0[1], x0[2], x0[0]], [x0[0], x0[1], x0[2]]); + var l = [1.0 - g[0], 1.0 - g[1], 1.0 - g[2]]; + var i1 = min([g[0], g[1], g[2]], [l[2], l[0], l[1]]); + var i2 = max([g[0], g[1], g[2]], [l[2], l[0], l[1]]); + + // vec3 x1 = x0 - i1 + vec3(C.x, C.x, C.x); + // vec3 x2 = x0 - i2 + vec3(C.y, C.y, C.y); // 2.0*C.x = 1/3 = C.y + // vec3 x3 = x0 - vec3(D.y, D.y, D.y); // -1.0+3.0*C.x = -0.5 = -D.y + var temp4 = vec3.create(); + vec3.subtract(x0, i1, temp4); + var x1 = vec3.create(); + vec3.add(temp4, [C[0], C[0], C[0]], x1); + var temp5 = vec3.create(); + vec3.subtract(x0, i2, temp5); + var x2 = vec3.create(); + vec3.add(temp5, [C[1], C[1], C[1]], x2); + var x3 = vec3.create(); + vec3.subtract(x0, [D[1], D[1], D[1]], x3); + + // i = mod289(i); + // vec4 p = permute( permute( permute( + // i.z + vec4(0.0, i1.z, i2.z, 1.0 )) + // + i.y + vec4(0.0, i1.y, i2.y, 1.0 )) + // + i.x + vec4(0.0, i1.x, i2.x, 1.0 )); + i = mod289_vec3(i); + var p = permute_vec4([i[2] + 0.0, i[2] + i1[2], i[2] + i2[2], i[2] + 1.0]); + p[0] += i[1] + 0.0; + p[1] += i[1] + i1[1]; + p[2] += i[1] + i2[1]; + p[3] += i[1] + 1.0; + p = permute_vec4(p); + p[0] += i[0] + 0.0; + p[1] += i[0] + i1[0]; + p[2] += i[0] + i2[0]; + p[3] += i[0] + 1.0; + p = permute_vec4(p); + + // float n_ = 0.142857142857f; // 1.0/7.0 + // vec3 ns = n_ * vec3(D.w, D.y, D.z) - vec3(D.x, D.z, D.x); +// var n_ = 0.142857142857; // 1.0/7.0 +// var ns = [ +// n_ * D[3] - D[0], +// n_ * D[1] - D[2], +// n_ * D[2] - D[0] +// ]; + var ns = [ + 0.28571430, + -0.92857140, + 0.14285715 + ]; + + // vec4 j = p - 49.0f * floor(p * ns.z * ns.z); // mod(p,7*7) + var j = [ + p[0] - 49.0 * Math.floor(p[0] * ns[2] * ns[2]), + p[1] - 49.0 * Math.floor(p[1] * ns[2] * ns[2]), + p[2] - 49.0 * 
Math.floor(p[2] * ns[2] * ns[2]), + p[3] - 49.0 * Math.floor(p[3] * ns[2] * ns[2]) + ]; + + // vec4 x_ = floor(j * ns.z); + // vec4 y_ = floor(j - 7.0f * x_ ); // mod(j,N) + var x_ = [ + Math.floor(j[0] * ns[2]), + Math.floor(j[1] * ns[2]), + Math.floor(j[2] * ns[2]), + Math.floor(j[3] * ns[2]) + ]; + var y_ = [ + Math.floor(j[0] - 7.0 * x_[0] ), + Math.floor(j[1] - 7.0 * x_[1] ), + Math.floor(j[2] - 7.0 * x_[2] ), + Math.floor(j[3] - 7.0 * x_[3] ) + ]; + + // vec4 x = x_ *ns.x + vec4(ns.y, ns.y, ns.y, ns.y); + // vec4 y = y_ *ns.x + vec4(ns.y, ns.y, ns.y, ns.y); + // vec4 h = 1.0f - abs(x) - abs(y); + var x = [ + x_[0] *ns[0] + ns[1], + x_[1] *ns[0] + ns[1], + x_[2] *ns[0] + ns[1], + x_[3] *ns[0] + ns[1] + ]; + var y = [ + y_[0] *ns[0] + ns[1], + y_[1] *ns[0] + ns[1], + y_[2] *ns[0] + ns[1], + y_[3] *ns[0] + ns[1] + ]; + var h = [ + 1.0 - Math.abs(x[0]) - Math.abs(y[0]), + 1.0 - Math.abs(x[1]) - Math.abs(y[1]), + 1.0 - Math.abs(x[2]) - Math.abs(y[2]), + 1.0 - Math.abs(x[3]) - Math.abs(y[3]) + ]; + + // vec4 b0 = vec4( vec2(x.x, x.y), vec2(y.x, y.y) ); + // vec4 b1 = vec4( vec2(x.z, x.w), vec2(y.z, y.w) ); + var b0 = [x[0], x[1], y[0], y[1]]; + var b1 = [x[2], x[3], y[2], y[3]]; + + // vec4 s0 = floor(b0)*2.0f + 1.0f; + // vec4 s1 = floor(b1)*2.0f + 1.0f; + // vec4 sh = -step(h, vec4(0.0)); + + var s0 = [ + Math.floor(b0[0])*2.0 + 1.0, + Math.floor(b0[1])*2.0 + 1.0, + Math.floor(b0[2])*2.0 + 1.0, + Math.floor(b0[3])*2.0 + 1.0 + ]; + var s1 = [ + Math.floor(b1[0])*2.0 + 1.0, + Math.floor(b1[1])*2.0 + 1.0, + Math.floor(b1[2])*2.0 + 1.0, + Math.floor(b1[3])*2.0 + 1.0 + ]; + var sh = step_vec4(h, [0.0, 0.0, 0.0, 0.0]); + sh[0] = -sh[0]; + sh[1] = -sh[1]; + sh[2] = -sh[2]; + sh[3] = -sh[3]; + + // vec4 a0 = vec4(b0.x, b0.z, b0.y, b0.w) + vec4(s0.x, s0.z, s0.y, s0.w) * vec4(sh.x, sh.x, sh.y, sh.y) ; + // vec4 a1 = vec4(b1.x, b1.z, b1.y, b1.w) + vec4(s1.x, s1.z, s1.y, s1.w) * vec4(sh.z, sh.z, sh.w, sh.w) ; + var a0 = [ + b0[0] + s0[0] * sh[0], + b0[2] + s0[2] * sh[0], + b0[1] + s0[1] * sh[1], + b0[3] + s0[3] * sh[1] + ]; + var a1 = [ + b1[0] + s1[0] * sh[2], + b1[2] + s1[2] * sh[2], + b1[1] + s1[1] * sh[3], + b1[3] + s1[3] * sh[3] + ]; + + // vec3 p0 = vec3(a0.x, a0.y, h.x); + // vec3 p1 = vec3(a0.z, a0.w, h.y); + // vec3 p2 = vec3(a1.x, a1.y, h.z); + // vec3 p3 = vec3(a1.z, a1.w, h.w); + var p0 = [a0[0], a0[1], h[0]]; + var p1 = [a0[2], a0[3], h[1]]; + var p2 = [a1[0], a1[1], h[2]]; + var p3 = [a1[2], a1[3], h[3]]; + + // vec4 norm = taylorInvSqrt(vec4(dot(p0,p0), dot(p1,p1), dot(p2, p2), dot(p3,p3))); + // p0 *= norm.x; + // p1 *= norm.y; + // p2 *= norm.z; + // p3 *= norm.w; + var norm = taylorInvSqrt_vec4([vec3.dot(p0,p0), vec3.dot(p1,p1), vec3.dot(p2, p2), vec3.dot(p3,p3)]); + p0 = [p0[0]*norm[0], p0[1]*norm[0], p0[2]*norm[0]]; + p1 = [p1[0]*norm[1], p1[1]*norm[1], p1[2]*norm[1]]; + p2 = [p2[0]*norm[2], p2[1]*norm[2], p2[2]*norm[2]]; + p3 = [p3[0]*norm[3], p3[1]*norm[3], p3[2]*norm[3]]; + + // vec4 m = max(0.6f - vec4(dot(x0,x0), dot(x1,x1), dot(x2,x2), dot(x3,x3)), 0.0); + // m = m * m; + var m = max_vec4([ + 0.6 - vec3.dot(x0,x0), + 0.6 - vec3.dot(x1,x1), + 0.6 - vec3.dot(x2,x2), + 0.6 - vec3.dot(x3,x3) + ], [ + 0.0, + 0.0, + 0.0, + 0.0 + ]); + m[0] *= m[0]; + m[1] *= m[1]; + m[2] *= m[2]; + m[3] *= m[3]; + + // return 42.0f * dot( m*m, vec4( dot(p0,x0), dot(p1,x1), + // dot(p2,x2), dot(p3,x3) ) ); + return 42.0 * vec4_dot([ + m[0] * m[0], + m[1] * m[1], + m[2] * m[2], + m[3] * m[3] + ], [ + vec3.dot(p0,x0), + vec3.dot(p1,x1), + vec3.dot(p2,x2), + vec3.dot(p3,x3) + ]); + }; + +}()); diff 
--git a/part2/stats.min.js b/part2/stats.min.js new file mode 100644 index 0000000..8195d09 --- /dev/null +++ b/part2/stats.min.js @@ -0,0 +1,5 @@ +Stats=function(){var l=Date.now(),m=l,g=0,n=Infinity,o=0,h=0,p=Infinity,q=0,r=0,s=0,f=document.createElement("div");f.id="stats";f.addEventListener("mousedown",function(b){b.preventDefault();t(++s%2)},!1);f.style.cssText="width:80px;opacity:0.9;cursor:pointer";var a=document.createElement("div");a.id="fps";a.style.cssText="padding:0 0 3px 3px;text-align:left;background-color:#002";f.appendChild(a);var i=document.createElement("div");i.id="fpsText";i.style.cssText="color:#0ff;font-family:Helvetica,Arial,sans-serif;font-size:9px;font-weight:bold;line-height:15px"; + i.innerHTML="FPS";a.appendChild(i);var c=document.createElement("div");c.id="fpsGraph";c.style.cssText="position:relative;width:74px;height:30px;background-color:#0ff";for(a.appendChild(c);74>c.children.length;){var j=document.createElement("span");j.style.cssText="width:1px;height:30px;float:left;background-color:#113";c.appendChild(j)}var d=document.createElement("div");d.id="ms";d.style.cssText="padding:0 0 3px 3px;text-align:left;background-color:#020;display:none";f.appendChild(d);var k=document.createElement("div"); + k.id="msText";k.style.cssText="color:#0f0;font-family:Helvetica,Arial,sans-serif;font-size:9px;font-weight:bold;line-height:15px";k.innerHTML="MS";d.appendChild(k);var e=document.createElement("div");e.id="msGraph";e.style.cssText="position:relative;width:74px;height:30px;background-color:#0f0";for(d.appendChild(e);74>e.children.length;)j=document.createElement("span"),j.style.cssText="width:1px;height:30px;float:left;background-color:#131",e.appendChild(j);var t=function(b){s=b;switch(s){case 0:a.style.display= + "block";d.style.display="none";break;case 1:a.style.display="none",d.style.display="block"}};return{REVISION:11,domElement:f,setMode:t,begin:function(){l=Date.now()},end:function(){var b=Date.now();g=b-l;n=Math.min(n,g);o=Math.max(o,g);k.textContent=g+" MS ("+n+"-"+o+")";var a=Math.min(30,30-30*(g/200));e.appendChild(e.firstChild).style.height=a+"px";r++;b>m+1E3&&(h=Math.round(1E3*r/(b-m)),p=Math.min(p,h),q=Math.max(q,h),i.textContent=h+" FPS ("+p+"-"+q+")",a=Math.min(30,30-30*(h/100)),c.appendChild(c.firstChild).style.height= + a+"px",m=b,r=0);return b},update:function(){l=this.end()}}}; diff --git a/resources/custom_wave1.png b/resources/custom_wave1.png new file mode 100644 index 0000000..0c00b79 Binary files /dev/null and b/resources/custom_wave1.png differ diff --git a/resources/custom_wave2.png b/resources/custom_wave2.png new file mode 100644 index 0000000..4c2fca1 Binary files /dev/null and b/resources/custom_wave2.png differ diff --git a/resources/emptyGrid.png b/resources/emptyGrid.png deleted file mode 100644 index 2ee870f..0000000 Binary files a/resources/emptyGrid.png and /dev/null differ diff --git a/resources/frag_globe.png b/resources/frag_globe.png new file mode 100644 index 0000000..c09362a Binary files /dev/null and b/resources/frag_globe.png differ diff --git a/resources/globe_bumpmap.png b/resources/globe_bumpmap.png deleted file mode 100644 index fa91a9f..0000000 Binary files a/resources/globe_bumpmap.png and /dev/null differ diff --git a/resources/globe_day.png b/resources/globe_day.png deleted file mode 100644 index e3cbf1f..0000000 Binary files a/resources/globe_day.png and /dev/null differ diff --git a/resources/globe_daycloud.png b/resources/globe_daycloud.png deleted file mode 100644 index ff00096..0000000 Binary files 
a/resources/globe_daycloud.png and /dev/null differ diff --git a/resources/globe_initial.png b/resources/globe_initial.png deleted file mode 100644 index 1e3bde5..0000000 Binary files a/resources/globe_initial.png and /dev/null differ diff --git a/resources/globe_night.png b/resources/globe_night.png deleted file mode 100644 index 6401768..0000000 Binary files a/resources/globe_night.png and /dev/null differ diff --git a/resources/globe_nightcloud.png b/resources/globe_nightcloud.png deleted file mode 100644 index 781aec0..0000000 Binary files a/resources/globe_nightcloud.png and /dev/null differ diff --git a/resources/globe_nospecmap.png b/resources/globe_nospecmap.png deleted file mode 100644 index c370735..0000000 Binary files a/resources/globe_nospecmap.png and /dev/null differ diff --git a/resources/globe_specmap.png b/resources/globe_specmap.png deleted file mode 100644 index 7ff01a5..0000000 Binary files a/resources/globe_specmap.png and /dev/null differ diff --git a/resources/globe_twilight.png b/resources/globe_twilight.png deleted file mode 100644 index ac5ea5c..0000000 Binary files a/resources/globe_twilight.png and /dev/null differ diff --git a/resources/oceanWave.png b/resources/oceanWave.png deleted file mode 100644 index 73b65d5..0000000 Binary files a/resources/oceanWave.png and /dev/null differ diff --git a/resources/simplex_1D.png b/resources/simplex_1D.png new file mode 100644 index 0000000..303d2bb Binary files /dev/null and b/resources/simplex_1D.png differ diff --git a/resources/simplex_2D1.png b/resources/simplex_2D1.png new file mode 100644 index 0000000..b20f0a5 Binary files /dev/null and b/resources/simplex_2D1.png differ diff --git a/resources/simplex_2D2.png b/resources/simplex_2D2.png new file mode 100644 index 0000000..e2935cd Binary files /dev/null and b/resources/simplex_2D2.png differ diff --git a/resources/sinWaveGrid.png b/resources/sinWaveGrid.png deleted file mode 100644 index 733f8d8..0000000 Binary files a/resources/sinWaveGrid.png and /dev/null differ diff --git a/resources/sine_wave.png b/resources/sine_wave.png new file mode 100644 index 0000000..072146c Binary files /dev/null and b/resources/sine_wave.png differ