Build a Camera Preview Web Component using Stencil.js for Barcode Scanning
Web Components are a set of features that provide a standard component model for the Web, allowing for encapsulation and interoperability of individual HTML elements.[1] We can define a custom element and use it in any HTML environment: with any framework, or with no framework at all.
Stencil is a compiler that generates Web Components. With it, we can use TypeScript, JSX, and CSS to create standards-based Web Components.[2]
In this article, we are going to build a camera preview web component using Stencil and write a barcode scanning demo using it.
Build a Camera Preview Web Component using Stencil.js
Start a New Project
Run the following command:
npm init stencil
You will be prompted to create a component project or an app project. Here, we choose component.
? Select a starter project.
Starters marked as [community] are developed by the Stencil
Community, rather than Ionic. For more information on the
Stencil Community, please see github.com/stencil-community
› - Use arrow-keys. Return to submit.
❯ component Collection of web components that can be
used anywhere
app [community] Minimal starter for building a Stencil
app or website
Generate a Camera Preview Component
In the project, create a new component named camera-preview:

npx stencil g camera-preview
To test the component, we can modify src\index.html:
- <my-component first="Stencil" last="'Don't call me a framework' JS"></my-component>
+ <camera-preview>
+ <p>Inner elements</p>
+ </camera-preview>
Then run the following command to test it:
npm run start
Use getUserMedia to Open the Camera
Let’s modify the component so that it can open the camera.
- Add a video element in the render function:

  render() {
    return (
      <div class="camera-container full">
        <video
          class="camera full"
          ref={(el) => this.camera = el as HTMLVideoElement}
          onLoadedData={() => this.onCameraOpened()}
          muted
          autoplay="autoplay"
          playsinline="playsinline"
          webkit-playsinline
        ></video>
        <slot/>
      </div>
    );
  }

  The CSS:

  .camera-container {
    position: relative;
  }

  .camera {
    position: absolute;
    object-fit: cover;
  }

  .full {
    width: 100%;
    height: 100%;
    left: 0;
    top: 0;
  }

  With this, the video element fills its container.
- Add two props, desiredCamera and desiredResolution:

  @Prop() desiredResolution?: Resolution;
  @Prop() desiredCamera?: string;

  Create a new definition.ts file and define a Resolution interface in it:

  export interface Resolution {
    width: number;
    height: number;
  }
- Load the list of existing camera devices:

  devices!: MediaDeviceInfo[];

  async loadDevices(){
    const constraints = {video: true, audio: false};
    // Ask for camera permission first so that device labels are populated.
    const stream = await navigator.mediaDevices.getUserMedia(constraints);
    const devices = await navigator.mediaDevices.enumerateDevices();
    let cameraDevices: MediaDeviceInfo[] = [];
    for (let i = 0; i < devices.length; i++) {
      let device = devices[i];
      if (device.kind == 'videoinput'){ // filter out audio devices
        cameraDevices.push(device);
      }
    }
    this.devices = cameraDevices;
    const tracks = stream.getTracks();
    for (let i = 0; i < tracks.length; i++) {
      tracks[i].stop(); // stop the camera opened for the permission request
    }
  }
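The device-filtering step above boils down to a pure function. Here is a sketch outside the component, where DeviceInfo is a hypothetical stand-in for the browser-only MediaDeviceInfo so the logic can run anywhere:

```typescript
// Hypothetical stand-in for the browser's MediaDeviceInfo.
interface DeviceInfo {
  kind: string;
  label: string;
  deviceId: string;
}

// Keep only video inputs, mirroring the filter in loadDevices.
function filterCameras(devices: DeviceInfo[]): DeviceInfo[] {
  return devices.filter((device) => device.kind === "videoinput");
}
```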
- When the component is loaded, open the camera device whose label contains the desiredCamera value, or open the first device otherwise. If the desiredResolution prop exists, apply the resolution as well.

  localStream!: MediaStream;

  componentDidLoad(){
    this.playWithDesired();
  }

  async playWithDesired(){
    if (!this.devices) {
      await this.loadDevices(); // load the camera devices list if it hasn't been loaded
    }
    let desiredDevice: string|null = this.getDesiredDevice(this.devices);
    if (desiredDevice) {
      if (this.desiredResolution) {
        this.play(desiredDevice, this.desiredResolution);
      } else {
        this.play(desiredDevice);
      }
    } else {
      throw new Error("No camera detected");
    }
  }

  getDesiredDevice(devices: MediaDeviceInfo[]){
    let count = 0;
    let desiredIndex = 0;
    for (let i = 0; i < devices.length; i++) {
      const device = devices[i];
      const label = device.label || `Camera ${count++}`;
      if (this.desiredCamera) {
        if (label.toLowerCase().indexOf(this.desiredCamera.toLowerCase()) != -1) {
          desiredIndex = i;
          break;
        }
      }
    }
    if (devices.length > 0) {
      return devices[desiredIndex].deviceId; // return the device id
    } else {
      return null;
    }
  }

  play(deviceId: string, desiredResolution?: Resolution) {
    this.stop(); // close the current stream before playing a new one
    var constraints = {};
    if (!!deviceId){
      constraints = {
        video: {deviceId: deviceId},
        audio: false
      };
    } else {
      constraints = {
        video: true,
        audio: false
      };
    }
    if (desiredResolution) {
      constraints["video"]["width"] = desiredResolution.width;
      constraints["video"]["height"] = desiredResolution.height;
    }
    let pThis = this;
    navigator.mediaDevices.getUserMedia(constraints).then(function(stream) {
      pThis.localStream = stream;
      // Attach the local stream to the video element
      pThis.camera.srcObject = stream;
    }).catch(function(err) {
      console.error('getUserMediaError', err, err.stack);
      alert(err.message);
    });
  }

  stop() {
    try {
      if (this.localStream){
        this.localStream.getTracks().forEach(track => track.stop());
      }
    } catch (e){
      alert(e.message);
    }
  }
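The way play() assembles its getUserMedia constraints can be factored into a small helper. buildConstraints below is a hypothetical extraction for illustration, not part of the component:

```typescript
// Hypothetical helper mirroring how play() builds its constraints object.
function buildConstraints(
  deviceId?: string,
  resolution?: { width: number; height: number }
): { video: boolean | Record<string, unknown>; audio: boolean } {
  const video: Record<string, unknown> = deviceId ? { deviceId } : {};
  if (resolution) {
    video.width = resolution.width;
    video.height = resolution.height;
  }
  // Fall back to `video: true` when nothing specific was requested.
  return { video: Object.keys(video).length > 0 ? video : true, audio: false };
}
```

The resulting object has the same shape as the one play() passes to navigator.mediaDevices.getUserMedia.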
- In the disconnectedCallback, stop the camera.

  disconnectedCallback() {
    this.stop();
  }
- Update index.html to have a test.

  <body>
    <camera-preview>
    </camera-preview>
    <script>
      const cameraElement = document.querySelector('camera-preview');
      cameraElement.desiredCamera = "back"; // use the back-facing camera
      cameraElement.desiredResolution = {width: 1280, height: 720};
    </script>
  </body>
Add an active Prop to Control the Camera
We can add an active prop to control the camera.
@Prop() active?: boolean;
When the component is loaded and the active prop is not false, open the camera.
componentDidLoad(){
if (this.active != false) {
this.playWithDesired();
}
}
We can also watch the prop. If it is set to true, open the camera; otherwise, close it.
@Watch('active')
watchPropHandler(newValue: boolean) {
if (newValue === true) {
this.playWithDesired();
}else{
this.stop();
}
}
Add Support for Frame Analysing like Barcode Scanning
- Add a public method to get the inner video element so that we can access the frame data.

  @Method()
  async getVideoElement() {
    return this.camera;
  }
- Add an opened event so that we can know when the camera is opened.

  @Event() opened?: EventEmitter<void>;

  onCameraOpened() {
    if (this.opened) {
      this.opened.emit();
    }
  }
- Draw overlays with SVG. The overlay highlights the positions and contents of frame analysing results such as barcodes. We can enable it with the drawOverlay prop.

  @Prop() drawOverlay?: boolean;

  renderSVGOverlay(){
    if (this.drawOverlay === true && this.analysingResults) {
      return (
        <svg
          preserveAspectRatio="xMidYMid slice"
          viewBox={this.viewBox}
          xmlns="http://www.w3.org/2000/svg"
          class="overlay full">
          {this.analysingResults.map((result, idx) => (
            <polygon
              key={"poly-" + idx}
              points={this.getPointsData(result)}
              class="polygon"
            />
          ))}
          {this.analysingResults.map((result, idx) => (
            <text
              key={"text-" + idx}
              x={result.localizationResult[0].x}
              y={result.localizationResult[0].y}
              fill="red"
              font-size="20"
            >{result.text}</text>
          ))}
        </svg>
      );
    }
  }

  getPointsData = (result: AnalysingResult) => {
    let pointsData = "";
    for (let index = 0; index < result.localizationResult.length; index++) {
      const point = result.localizationResult[index];
      pointsData = pointsData + point.x + "," + point.y + " ";
    }
    return pointsData.trim();
  }
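To make getPointsData concrete, here is a standalone version with a sample result; the component's method performs the same string building:

```typescript
interface Point2D { x: number; y: number; }
interface AnalysingResult { localizationResult: Point2D[]; text: string; }

// Standalone equivalent of the component's getPointsData.
function getPointsData(result: AnalysingResult): string {
  return result.localizationResult
    .map((point) => `${point.x},${point.y}`)
    .join(" ");
}

// A square barcode region becomes "0,0 10,0 10,10 0,10",
// which is exactly what an SVG polygon's points attribute expects.
const sample: AnalysingResult = {
  text: "demo",
  localizationResult: [
    { x: 0, y: 0 }, { x: 10, y: 0 }, { x: 10, y: 10 }, { x: 0, y: 10 },
  ],
};
```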
The definition of AnalysingResult:

export interface AnalysingResult {
  localizationResult: Point2D[];
  text: string;
  rawData?: any;
}

export interface Point2D {
  x: number;
  y: number;
}
In the onCameraOpened handler, update the viewBox based on the current resolution of the camera preview.

onCameraOpened() {
  if (this.opened) {
    this.opened.emit();
  }
  this.updateViewBox();
}

updateViewBox(){
  this.viewBox = "0 0 " + this.camera.videoWidth + " " + this.camera.videoHeight;
}
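One piece the demo later in this article relies on is a public method to push fresh results into the overlay: the Vue demo calls updateAnalysingResults on the element. Its body is not listed in the article, but it presumably just updates the state that renderSVGOverlay reads. Here is a sketch, written as a plain class so it can run outside the browser; in the Stencil component, the method would carry @Method() and analysingResults would be a @State() field so the SVG overlay re-renders:

```typescript
interface Point2D { x: number; y: number; }
interface AnalysingResult { localizationResult: Point2D[]; text: string; }

class OverlayState {
  analysingResults: AnalysingResult[] = [];

  // Sketch of the public method the demo calls; async because Stencil's
  // @Method() requires public methods to return a Promise.
  async updateAnalysingResults(results: AnalysingResult[]): Promise<void> {
    this.analysingResults = results;
  }
}
```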
We’ve now finished building the component. Run the following command to build the project:
npm run build
Use the Web Component in Vanilla JS or Web Frameworks
The camera preview web component can work with or without Web frameworks.
Use the Web Component in Vanilla JS
In the HTML file, include the ES modules of the component and add the camera-preview element.
<html>
<head>
<script type="module">
import { defineCustomElements } from './dist/esm/loader.js';
defineCustomElements();
</script>
</head>
<body>
<camera-preview>
</camera-preview>
</body>
</html>
We can modify its props and add event listeners in JavaScript.
const cameraElement = document.querySelector('camera-preview');
const onOpened = () => {
console.log("opened");
}
cameraElement.addEventListener("opened",onOpened);
cameraElement.drawOverlay = true;
cameraElement.desiredCamera = "founder";
cameraElement.active = false;
cameraElement.desiredResolution = {width:1280,height:720};
Use the Web Component in Angular
- Open src\index.html and add the following to head so that we can use the camera-preview component.

  <script type="module">
    import { defineCustomElements } from './dist/esm/loader.js';
    defineCustomElements();
  </script>
- In app.module.ts, add the CUSTOM_ELEMENTS_SCHEMA schema so that we can use Web Components in an Angular project.

  import { CUSTOM_ELEMENTS_SCHEMA } from '@angular/core';

  @NgModule({
    declarations: [
      AppComponent
    ],
    imports: [
      BrowserModule,
    ],
    providers: [],
    schemas: [ CUSTOM_ELEMENTS_SCHEMA ],
    bootstrap: [AppComponent]
  })
  export class AppModule { }
- In app.component.html, we can use camera-preview like this:

  <camera-preview
    ngModel
    #cameraElement
    desired-camera="back"
    draw-overlay="true"
    (opened)="onOpened()"
    [active]="active"
  >
  </camera-preview>
- In app.component.ts, define a cameraElement property to hold a reference to the element.

  @ViewChild('cameraElement') cameraElement: any;
Use the Web Component in Vue
- Open index.html and add the following to head so that we can use the camera-preview component.

  <script type="module">
    import { defineCustomElements } from './dist/esm/loader.js';
    defineCustomElements();
  </script>
- In the template, we can use the component like this:

  <camera-preview
    ref="cameraElement"
    desired-camera="back"
    draw-overlay="true"
    v-on:opened="onOpened"
    :active="active"
  >
  </camera-preview>
- We also need to add compilerOptions in vite.config.ts to treat elements with a dash as custom elements.

  export default defineConfig({
    plugins: [vue({
      template: {
        compilerOptions: {
          // treat all tags with a dash as custom elements
          isCustomElement: (tag) => tag.includes('-')
        }
      }
    })],
    resolve: {
      alias: {
        '@': fileURLToPath(new URL('./src', import.meta.url))
      }
    }
  })
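The isCustomElement predicate is just a dash check, since the Custom Elements spec requires a hyphen in every custom element name:

```typescript
// Same predicate as in vite.config.ts above.
const isCustomElement = (tag: string): boolean => tag.includes("-");
```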
Use the Web Component in React
- Open index.html and add the following to head so that we can use the camera-preview component.

  <script type="module">
    import { defineCustomElements } from './dist/esm/loader.js';
    defineCustomElements();
  </script>
- In the JSX, we can use the component like this:

  <camera-preview
    ref={cameraElement}
    active={active}
    desired-camera="back"
    draw-overlay={true}
  >
  </camera-preview>
- Events emitted by a Web Component may not properly propagate through a React render tree, so we need to manually attach event handlers to handle them.[3]

  For example, run the following to add an event listener for the opened event:

  cameraElement.current.addEventListener("opened", onOpened);
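In a React component, this attach/detach pair usually lives in a useEffect with a cleanup function. The pattern can be sketched with a plain EventTarget standing in for the camera-preview element; the helper name attachOpenedListener is made up for illustration:

```typescript
// Attach the "opened" listener and return a cleanup function, which is
// the shape a React useEffect callback would return to run on unmount.
function attachOpenedListener(
  element: EventTarget,
  onOpened: () => void
): () => void {
  element.addEventListener("opened", onOpened);
  return () => element.removeEventListener("opened", onOpened);
}
```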
A Barcode Scanner Demo in Vue
Next, we are going to write a barcode scanner demo using Vue, with Dynamsoft Barcode Reader to read barcodes from the camera preview.
- Create a new project:

  npm init vue@latest

  Then include the camera-preview component as described in the previous part and add the following style:

  camera-preview {
    height: 100%;
    width: 100%;
    position: absolute;
    right: 0;
    top: 0;
  }
- Install Dynamsoft Barcode Reader:

  npm install dynamsoft-javascript-barcode
- Open App.vue and initialize Dynamsoft Barcode Reader in onMounted.

  let reader: BarcodeReader;

  onMounted(async () => {
    BarcodeReader.license = "DLS2eyJoYW5kc2hha2VDb2RlIjoiMjAwMDAxLTE2NDk4Mjk3OTI2MzUiLCJvcmdhbml6YXRpb25JRCI6IjIwMDAwMSIsInNlc3Npb25QYXNzd29yZCI6IndTcGR6Vm05WDJrcEQ5YUoifQ==";
    reader = await BarcodeReader.createInstance();
  })

  You can apply for a trial license here.
- Add a start camera button, a stop camera button and a continuous scan checkbox.

  <template>
    <div :style="{display: active ? 'none' : '' }">
      <button v-on:click="startCamera">Start Camera</button>
      <label>
        <input type="checkbox" v-model="continuous" />
        Continuous Scan
      </label>
    </div>
    <camera-preview
      ref="cameraElement"
      desired-camera="back"
      draw-overlay="true"
      v-on:opened="onOpened"
      :active="active"
      :style="{display: active ? '' : 'none' }"
    >
      <button id="close-btn" v-on:click="stopCamera">Close</button>
    </camera-preview>
  </template>

  When the user clicks the start camera button, start the camera and reveal the camera preview component. When a barcode is found with continuous scan disabled, or the user clicks the stop camera button, stop the camera and hide the camera preview component.

  const startCamera = () => {
    setActive(true);
  }

  const stopCamera = () => {
    stopDecoding(); // stop reading barcodes
    setActive(false);
  }
- In the opened event handler, start decoding once the camera is opened.

  let scanned = false;
  let decoding = false;
  let interval: any = null;

  const onOpened = () => {
    startDecoding();
  }

  const startDecoding = () => {
    scanned = false;
    stopDecoding();
    interval = setInterval(decode, 40);
  }

  const stopDecoding = () => {
    clearInterval(interval);
  }
  The decode function:

  const decode = async () => {
    // Run only if the previous decode has completed and both the reader
    // and the camera element are initialized.
    if (decoding === false && reader && cameraElement.value) {
      const video = await cameraElement.value.getVideoElement();
      decoding = true;
      try {
        let results = await reader.decode(video);
        if (results.length > 0 && scanned === false && continuous.value === false) {
          scanned = true;
          stopDecoding();
          stopCamera();
          alert(results[0].barcodeText);
        } else {
          await cameraElement.value.updateAnalysingResults(wrapResults(results));
        }
      } catch (error) {
        console.log(error);
      }
      decoding = false;
    }
  }
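The decoding flag above is a re-entrancy guard: the interval fires every 40 ms, but a single decode can take longer, so overlapping ticks are skipped. The same idea generalizes to a small wrapper; makeGuarded below is a sketch for illustration, not code from the article:

```typescript
// Wrap an async function so that calls made while a previous call is
// still in flight are skipped instead of overlapping.
function makeGuarded<T>(fn: () => Promise<T>): () => Promise<T | undefined> {
  let busy = false;
  return async () => {
    if (busy) return undefined; // previous call still running: skip this tick
    busy = true;
    try {
      return await fn();
    } finally {
      busy = false;
    }
  };
}
```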
  The barcode results are wrapped as AnalysingResults to draw the overlay.

  const wrapResults = (results: TextResult[]) => {
    let analysingResults = [];
    for (let index = 0; index < results.length; index++) {
      const result = results[index];
      let analysingResult: any = {};
      analysingResult.text = result.barcodeText;
      analysingResult.localizationResult = [
        {x: result.localizationResult.x1, y: result.localizationResult.y1},
        {x: result.localizationResult.x2, y: result.localizationResult.y2},
        {x: result.localizationResult.x3, y: result.localizationResult.y3},
        {x: result.localizationResult.x4, y: result.localizationResult.y4}
      ];
      analysingResults.push(analysingResult);
    }
    return analysingResults;
  }
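Here is a standalone version of wrapResults for illustration, with the TextResult fields it reads mocked out; the real TextResult type comes from dynamsoft-javascript-barcode:

```typescript
// Only the fields wrapResults actually reads from a TextResult.
interface MockTextResult {
  barcodeText: string;
  localizationResult: {
    x1: number; y1: number; x2: number; y2: number;
    x3: number; y3: number; x4: number; y4: number;
  };
}

interface AnalysingResult {
  text: string;
  localizationResult: { x: number; y: number }[];
}

// Same mapping as the demo's wrapResults: four corner points plus the text.
function wrapResults(results: MockTextResult[]): AnalysingResult[] {
  return results.map((result) => ({
    text: result.barcodeText,
    localizationResult: [
      { x: result.localizationResult.x1, y: result.localizationResult.y1 },
      { x: result.localizationResult.x2, y: result.localizationResult.y2 },
      { x: result.localizationResult.x3, y: result.localizationResult.y3 },
      { x: result.localizationResult.x4, y: result.localizationResult.y4 },
    ],
  }));
}
```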
The demo is now complete. You can try the online demo.
Source Code
The Camera Preview Component:
https://github.com/xulihang/camera-preview-component
Demos:
https://github.com/xulihang/camera-preview-demo