Interactive Blob Reacting to Microphone Sound Using HTML, CSS, and JS

Answered using ChatGPT 4o by ChatGPT4o

Question:

Please write HTML, CSS, and JS code that shows a blob: an interactive blob that reacts to microphone sound.

Answer:

Creating an interactive blob that reacts to microphone sound involves capturing the audio, analyzing it, and redrawing a shape on every frame. Below is a simple example using HTML, CSS, and JavaScript. It uses the Web Audio API to capture and analyze microphone input and the Canvas API to draw and animate the blob. Note that getUserMedia only works in a secure context (HTTPS or localhost), and the browser will ask for permission to use the microphone.

HTML

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Interactive Blob</title>
    <link rel="stylesheet" href="styles.css">
</head>
<body>
    <canvas id="blobCanvas"></canvas>
    <script src="script.js"></script>
</body>
</html>

CSS (styles.css)

body, html {
    margin: 0;
    padding: 0;
    overflow: hidden;
    display: flex;
    justify-content: center;
    align-items: center;
    height: 100vh;
    background-color: #282c34;
}

canvas {
    display: block;
}

JavaScript (script.js)

const canvas = document.getElementById('blobCanvas');
const ctx = canvas.getContext('2d');
canvas.width = window.innerWidth;
canvas.height = window.innerHeight;

let audioContext;
let analyser;
let dataArray;
let bufferLength;

function initAudio() {
    // Ask for microphone access; the browser prompts the user for permission.
    navigator.mediaDevices.getUserMedia({ audio: true })
        .then(stream => {
            audioContext = new (window.AudioContext || window.webkitAudioContext)();
            // Some browsers create the context in a suspended state; resuming is a
            // no-op when it is already running, so this call is harmless either way.
            audioContext.resume();
            const source = audioContext.createMediaStreamSource(stream);
            analyser = audioContext.createAnalyser();
            source.connect(analyser);
            analyser.fftSize = 256; // 256-point FFT -> 128 frequency bins
            bufferLength = analyser.frequencyBinCount;
            dataArray = new Uint8Array(bufferLength);
            animate();
        })
        .catch(err => {
            console.error('Error accessing microphone:', err);
        });
}

function drawBlob() {
    // Copy the current frequency spectrum (one byte per bin, 0-255) into dataArray.
    analyser.getByteFrequencyData(dataArray);
    // Use the loudest bin to set the blob's radius.
    const radius = Math.max(...dataArray) / 2;
    const centerX = canvas.width / 2;
    const centerY = canvas.height / 2;

    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.beginPath();
    ctx.moveTo(centerX + radius, centerY);

    // Approximate a circle with straight segments every 10 degrees.
    for (let i = 0; i <= 360; i += 10) {
        const angle = i * Math.PI / 180;
        const x = centerX + radius * Math.cos(angle);
        const y = centerY + radius * Math.sin(angle);
        ctx.lineTo(x, y);
    }

    ctx.closePath();
    ctx.fillStyle = '#61dafb';
    ctx.fill();
}

function animate() {
    // Schedule the next frame, then redraw the blob (~60 times per second).
    requestAnimationFrame(animate);
    drawBlob();
}

initAudio();
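
One practical note: the canvas size is read from the window only once, at load time, so resizing the browser leaves the drawing surface at its old dimensions. An optional handler like the following (not part of the original snippet) keeps it full-screen:

// Optional: keep the canvas matched to the viewport when the window is resized.
window.addEventListener('resize', () => {
    canvas.width = window.innerWidth;
    canvas.height = window.innerHeight;
});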

Explanation

  1. HTML: The HTML file includes a canvas element where the blob will be drawn.
  2. CSS: The CSS file removes default margins and scrollbars, uses flexbox on the body to center the canvas, and sets a dark background so the blob stands out. (The canvas itself is sized to the viewport in JavaScript.)
  3. JavaScript:
    • Audio Initialization: The initAudio function requests access to the user's microphone and sets up the Web Audio API to analyze the audio data. Some browsers keep a newly created AudioContext suspended until the page receives a user gesture; an optional gesture-gated variant is sketched right after this list.
    • Blob Drawing: The drawBlob function uses the frequency data from the microphone to determine the size of the blob. It draws a circle that changes size based on the loudest frequency.
    • Animation: The animate function continuously updates the canvas by calling drawBlob and using requestAnimationFrame for smooth animation.
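
If the blob never moves because the audio context stays suspended, a common workaround is to start the audio only after the first click or tap. This is an optional variation, not part of the original code, and it assumes you remove the unconditional initAudio() call at the bottom of script.js:

// Optional: defer audio start until the first user gesture, which satisfies
// browser autoplay policies that keep an AudioContext suspended otherwise.
document.addEventListener('click', initAudio, { once: true });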

This code provides a basic interactive blob that reacts to microphone input. You can further enhance it by adding more complex shapes, smoothing the animation, or layering on more visual effects; one possible direction is sketched below.
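
For example, one way to get a more organic, wobbling shape and a less jumpy size is to vary the radius per point using individual frequency bins and to ease the base radius toward its new value each frame. The sketch below is one possible variation of drawBlob built on the same globals (analyser, dataArray, bufferLength, ctx, canvas); the 0.1 smoothing factor and the / 8 divisor are arbitrary tuning choices, not anything prescribed by the APIs:

let smoothedRadius = 0; // persists between frames so the size eases instead of jumping

function drawWobblyBlob() {
    analyser.getByteFrequencyData(dataArray);

    // Ease the base radius toward the loudest bin (simple exponential smoothing).
    const target = Math.max(...dataArray) / 2;
    smoothedRadius += (target - smoothedRadius) * 0.1;

    const centerX = canvas.width / 2;
    const centerY = canvas.height / 2;

    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.beginPath();

    const points = 36; // one point every 10 degrees
    for (let i = 0; i < points; i++) {
        const angle = (i / points) * Math.PI * 2;
        // Map each point to a frequency bin and let that bin push the point outward.
        const bin = dataArray[Math.floor((i / points) * bufferLength)];
        const r = smoothedRadius + bin / 8; // divide to keep the wobble subtle
        const x = centerX + r * Math.cos(angle);
        const y = centerY + r * Math.sin(angle);
        if (i === 0) ctx.moveTo(x, y);
        else ctx.lineTo(x, y);
    }

    ctx.closePath();
    ctx.fillStyle = '#61dafb';
    ctx.fill();
}

To try it, call drawWobblyBlob instead of drawBlob inside animate; everything else stays the same. Lowering the smoothing factor makes the size change more sluggish, and raising the divisor makes the wobble subtler.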