Stream camera output to a video element in React
A React web application can access the camera through the browser's getUserMedia API, which is well supported in modern browsers. Our objective is to show the camera output in a video element.
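If you want to guard against older browsers, you can feature-detect the API before requesting the camera; a minimal sketch:
if (!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia) {
  // The camera API is not available; show a fallback message instead.
  console.warn("getUserMedia is not supported in this browser");
}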
Create a video element and attach a ref to it.
<video ref={videoRef} autoPlay />
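The videoRef used here comes from React's useRef hook, declared inside the component:
const videoRef = useRef(null);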
When the component mounts, request the camera stream inside a useEffect hook:
navigator.mediaDevices.getUserMedia({ video: true })
  .then((stream) => {});
Inside the then callback, attach the camera stream to the video element:
videoRef.current.srcObject = stream;
With these changes, the camera stream is available in the video element. The complete source code is shown below:
import { useRef, useEffect } from "react";

export default function App() {
  const videoRef = useRef(null);

  useEffect(() => {
    // Request the camera stream once, on mount, and attach it to the video element.
    navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
      videoRef.current.srcObject = stream;
    });
  }, []);

  return (
    <div className="App">
      <video ref={videoRef} autoPlay />
    </div>
  );
}
The code is also available in a CodeSandbox.
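Note that the effect above never stops the camera; it stays on for as long as the page is open. If you want to release it when the component unmounts, here is a minimal sketch of the same effect with a cleanup function, keeping a reference to the stream so its tracks can be stopped:
useEffect(() => {
  let activeStream;
  navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
    activeStream = stream;
    if (videoRef.current) {
      videoRef.current.srcObject = stream;
    }
  });
  return () => {
    // Stop every track so the camera is released when the component unmounts.
    if (activeStream) {
      activeStream.getTracks().forEach((track) => track.stop());
    }
  };
}, []);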
Progressive web apps bring web apps closer to native apps, including access to device capabilities such as the camera we use here. That makes our web app a good candidate for becoming a progressive web app. To turn an app into one, we have to provide a web app manifest and register a service worker. I have written more about progressive web apps in another blog post.
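For reference, registering a service worker usually looks something like the sketch below; the /service-worker.js path is only an assumed location for your worker script, and the manifest is a separate JSON file linked from the HTML.
// Register the service worker if the browser supports it.
// The /service-worker.js path is an assumption; adjust it to where your worker is served from.
if ("serviceWorker" in navigator) {
  navigator.serviceWorker.register("/service-worker.js");
}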