Weather Visualization is a real-time generative artwork that converts live meteorological data into an evolving abstract composition. The system retrieves temperature, humidity, cloud cover, wind vectors, precipitation, and solar timing from WeatherAPI and maps these parameters to color, form, and motion on the canvas.
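The retrieval step reduces to a single asynchronous request. A minimal sketch, assuming WeatherAPI's current.json endpoint and its documented field names (temp_c, humidity, cloud, wind_kph, wind_degree, precip_mm); the API key and query location are placeholders, and the shape of the returned object is my own convention rather than the project's:

```javascript
// Hedged sketch of the data-retrieval step. The endpoint and field names
// follow WeatherAPI's current.json response format; verify against the docs.
const API_KEY = 'YOUR_WEATHERAPI_KEY'; // placeholder

async function fetchWeather(location) {
  const url = `https://api.weatherapi.com/v1/current.json?key=${API_KEY}&q=${encodeURIComponent(location)}`;
  const response = await fetch(url);
  if (!response.ok) throw new Error(`WeatherAPI request failed: ${response.status}`);
  const data = await response.json();

  // Keep only the parameters the composition maps to visuals.
  return {
    tempC: data.current.temp_c,        // drives hue
    humidity: data.current.humidity,   // 0-100 %
    cloud: data.current.cloud,         // 0-100 % cover
    windKph: data.current.wind_kph,    // turbulence strength
    windDeg: data.current.wind_degree, // rotation direction
    precipMm: data.current.precip_mm,  // particle density
  };
}
```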
The project uses HSB-based color mapping to produce perceptually coherent palettes: high temperatures generate red and orange hues, low temperatures shift the palette toward blue, and atmospheric conditions such as rainfall, snow, and cloud density modulate saturation and brightness.
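In p5.js this comes down to a few map() calls in HSB color mode. A minimal sketch, assuming the weather object from the fetch example above; the anchor temperatures and output ranges are illustrative, not the project's tuned values:

```javascript
// Illustrative temperature/condition-to-color mapping in HSB space.
function weatherColor(w) {
  colorMode(HSB, 360, 100, 100);

  // Cold readings sit in the blue range, hot readings in red/orange.
  const hue = map(constrain(w.tempC, -10, 35), -10, 35, 220, 10);

  // Heavier cloud cover washes out saturation; precipitation dims brightness.
  const saturation = map(w.cloud, 0, 100, 90, 35);
  const brightness = map(w.precipMm, 0, 10, 95, 60, true); // clamp beyond 10 mm

  return color(hue, saturation, brightness);
}
```

Snow could be treated the same way, for example by pushing saturation down and brightness up when the reported condition is frozen precipitation.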
A layered noise-driven geometry engine interprets wind speed and direction as turbulence and rotational distortion, producing fluid, painterly structures that continually regenerate. A lightweight particle system renders precipitation, and a forecast slider lets viewers scrub forward in time and watch the composition respond to predicted conditions.
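A condensed sketch of those two layers, again assuming the weather object above: a single noise-displaced ring stands in for the layered geometry engine, and all constants (base radius, particle counts, fall speed) are illustrative rather than the project's actual parameters:

```javascript
// One noise-distorted ring (wind -> turbulence and rotation) plus a
// simple falling-particle layer for precipitation.
let drops = [];

function drawWeatherLayers(w) {
  // --- noise-driven geometry ---
  const turbulence = map(w.windKph, 0, 60, 5, 60, true); // wind speed -> distortion amplitude
  push();
  translate(width / 2, height / 2);
  rotate(radians(w.windDeg) + frameCount * 0.002);        // slow drift along the wind vector
  noFill();
  beginShape();
  for (let a = 0; a < TWO_PI; a += 0.05) {
    // Perlin noise displaces the base radius, animated over time.
    const n = noise(cos(a) + 1, sin(a) + 1, frameCount * 0.01);
    vertex((150 + n * turbulence) * cos(a), (150 + n * turbulence) * sin(a));
  }
  endShape(CLOSE);
  pop();

  // --- precipitation particles ---
  const target = floor(map(w.precipMm, 0, 10, 0, 300, true)); // rain rate -> particle count
  while (drops.length < target) drops.push(createVector(random(width), random(-height, 0)));
  drops.length = target;                                      // shed particles when rain eases
  for (const d of drops) {
    d.y += 4 + w.windKph * 0.05;                              // fall faster in strong wind
    d.x += w.windKph * 0.02;                                  // slight horizontal drift
    if (d.y > height) { d.y = 0; d.x = random(width); }
    line(d.x, d.y, d.x, d.y + 8);
  }
}
```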
This work forms part of my broader research on AI-based environmental art, focusing on how computational processes can transform natural data into expressive, interactive visual experiences.
Languages & Libraries: JavaScript, p5.js
Data Sources: WeatherAPI live meteorological data (temperature, wind, humidity, cloud cover, precipitation)
Data Processing: Real-time JSON parsing + environmental normalization (see the sketch after this list)
Visual Engine:
  Noise-driven geometry
  HSB-based color modulation
  Wind vector–based motion
  Precipitation particle system
Interaction: Cursor-driven turbulence & rotation
Tools: VSCode, GitHub, Chrome DevTools
Methods: Generative algorithms, environmental data mapping, procedural geometry
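The environmental normalization mentioned under Data Processing is the glue between the raw API values and the visual parameters above. A minimal sketch; the clamping ranges are assumptions, not the project's calibrated bounds:

```javascript
// Clamp raw meteorological values into fixed ranges and rescale to 0-1
// so every visual parameter can consume them uniformly.
function normalize(value, min, max) {
  return Math.min(1, Math.max(0, (value - min) / (max - min)));
}

function normalizeWeather(w) {
  return {
    heat: normalize(w.tempC, -10, 35),
    humidity: normalize(w.humidity, 0, 100),
    cloud: normalize(w.cloud, 0, 100),
    wind: normalize(w.windKph, 0, 60),
    rain: normalize(w.precipMm, 0, 10),
  };
}
```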