// Live video stream management for HTML5 video. Uses FFMPEG to connect to the H.264 camera stream.
// The camera stream is remuxed to an MP4 stream for HTML5 video compatibility, and segments are recorded for later playback.
var liveStream = function (req, resp) { // handle each client request by instantiating a new FFMPEG instance
    // For live streaming, create a fragmented MP4 file with an empty moov atom (no seeking possible).
    var reqUrl = url.parse(req.url, true);
    var cameraName = typeof reqUrl.pathname === "string" ? reqUrl.pathname.substring(1) : undefined;
    if (cameraName) {
        try {
            cameraName = decodeURIComponent(cameraName); // can throw a URI malformed exception
        } catch (exception) {
            console.log("Live Camera Streamer bad request received - " + req.url);
            return false;
        }
    } else {
        console.log("Live Camera Streamer - incorrect camera requested " + cameraName);
        return false;
    }
    console.log("Client connection made to live Camera Streamer requesting camera: " + cameraName);
    resp.writeHead(200, {
        //"Transfer-Encoding": "binary"
        "Connection": "keep-alive"
        , "Content-Type": "video/mp4"
        //, "Content-Length": chunksize // ends after all bytes delivered
        , "Accept-Ranges": "bytes" // helps Chrome
    });
    for (var cam in cameras) {
        if (cameraName.toLowerCase() === cameras[cam].name.toLowerCase()) {
            if (!cameras[cam].liveStarted) {
                cameras[cam].liveffmpeg = child_process.spawn("ffmpeg", [
                    "-rtsp_transport", "tcp", "-i", cameras[cam].rtsp, "-vcodec", "copy", "-f", "mp4", "-movflags", "frag_keyframe+empty_moov",
                    "-reset_timestamps", "1", "-vsync", "1", "-flags", "global_header", "-bsf:v", "dump_extra", "-y", "-" // output to stdout
                ], { detached: false });
                cameras[cam].liveStarted = true;
                cameras[cam].liveffmpeg.stdout.pipe(resp); // stream the remuxed MP4 straight to the HTTP response
                cameras[cam].liveffmpeg.stderr.on("data", function (data) {
                    console.log(cameras[cam].name + " -> " + data);
                });
                cameras[cam].liveffmpeg.on("exit", function (code) {
                    console.log(cameras[cam].name + " live FFMPEG terminated with code " + code);
                });
                cameras[cam].liveffmpeg.on("error", function (e) {
                    console.log(cameras[cam].name + " live FFMPEG system error: " + e);
                });
            }
            break; // keep the cam variable pointing at the selected camera
        }
    }
    if (cameras[cam].liveStarted === false) {
        // TODO: handle the case where no camera matched the request
    }
    req.on("close", function () {
        shutStream("closed");
    });
    req.on("end", function () {
        shutStream("ended");
    });
    function shutStream(event) {
        // TODO: the stream is only shut when the browser has exited, so switching screens in the client app does not kill the session
        console.log("Live streaming connection to client has " + event);
        if (typeof cameras[cam].liveffmpeg !== "undefined") {
            cameras[cam].liveffmpeg.kill();
            cameras[cam].liveStarted = false;
        }
    }
    return true;
};
Hi, this is cool. I'm a newcomer; can you supply the complete source code?
I am really interested in this. Would love to offer you a bounty for some help applying it to my project. k i d m e t 4 8 @ gmail
Hi, sorry I have not been reading messages on GitHub lately. This code was quite painful to get working, especially the ffmpeg parts, but it's been working well for a couple of months now. The complete source code is part of a larger application framework I'm developing, so it may not be directly reusable outside the framework; the relevant bits are included in the gist above.
The gist should be enough to get you going, but I'd be happy to answer any specific questions you may have.
In a PhoneGap app, a few .js files are generated:
jquery.mobile-1.0a3.min
jquery-1.5.min
phonegap
Which file would you input this code into? And what is the best means of delivery for the actual video content in an .html file?
jQuery is a library of JavaScript helper functions, and I'm not familiar with PhoneGap, but I assume it is a library as well. You will need to find the main JavaScript file for your application, add the video tag to the HTML, then update the script tags with code similar to the above. Getting this working requires intermediate web coding skills, so if you don't know about jQuery, this stuff is probably quite new to you and it is likely going to be difficult to get working.
The video HTML tag:
<video id="widget" width="100" height="100" style="width: 100px; height: 100px; position: absolute; left: 0px; top: 0px" poster="..\images\videoSecurity.jpg">
  <source id="vidSource" type="video/mp4">
</video>
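To be clear about what `vidSource` should point at: not the camera itself, but the Node server running `liveStream`, with the camera name as the URL path. A minimal sketch, where `buildStreamUrl` is a hypothetical helper and the host, port, and camera name are placeholder assumptions:

```javascript
// Hypothetical helper: builds the URL the <source> element should point at.
// The path is the camera name, URI-encoded because liveStream() decodes it.
function buildStreamUrl(host, port, cameraName) {
  return "http://" + host + ":" + port + "/" + encodeURIComponent(cameraName);
}

// Browser-side wiring (guarded so the snippet also runs under plain Node):
if (typeof document !== "undefined") {
  var source = document.getElementById("vidSource");
  source.src = buildStreamUrl("192.168.1.10", 3601, "cam1"); // placeholder host/port
  document.getElementById("widget").load(); // tell the <video> element to (re)load its source
}
```

The server matches the path against `cameras[cam].name` case-insensitively, so the name in the URL just has to match one of the configured cameras.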
Definitely new, learning as I stumble along. I'll take a stab at this as I think the phonegap.js file is the main javascript file. I can copy and paste the code just as you have it, correct?
Unlikely; you will probably need to modify some of it to suit the PhoneGap framework.
Yea, I am having no luck here. Do you have any suggestion as to how I figure out which file is my "main javascript" file?
OK, I believe I have it set up where I need it. A couple of quick questions: would "vidSource" be the IP of the camera I am trying to stream? And also, are some <'s missing? I have it as so in the HTML.
Does this actually work well for you? Because it absolutely does not for me. Firefox will not understand the output format at all, and in Chrome it takes about half a minute to even start streaming; when it does, the picture comes in sped-up, broken bursts. Huge timing issues, absolutely unusable.
I'm sourcing from a webcam through VLC, at a mere 320 kbps (320x180 px). There's no way it's a bandwidth or CPU problem; both the source and the server have plenty.
No luck for me either.
Noorus, I'd like to talk to you directly if possible to try to work out some kinks.
If I may add this point when choosing a container format for the live stream: Firefox reads .webm but not MP4, while Chrome can read MP4; however, its sister Chromium can read .webm as well.
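Building on that point, a WebM variant of the spawn arguments could look like the sketch below. This is an assumption, not from the gist: it transcodes H.264 to VP8 with libvpx (so it costs CPU, unlike the `-vcodec copy` remux above), and the response's Content-Type would need to change to `video/webm`.

```javascript
// Hypothetical WebM variant of the ffmpeg arguments: re-encodes the H.264
// camera stream to VP8 so Firefox/Chromium can play it natively.
function buildWebmArgs(rtspUrl) {
  return [
    "-rtsp_transport", "tcp", "-i", rtspUrl,
    "-vcodec", "libvpx", "-f", "webm",
    "-deadline", "realtime", // favour encoding speed over quality for live use
    "-y", "-"                // write the stream to stdout
  ];
}
```

You would pass the returned array to `child_process.spawn("ffmpeg", …)` exactly as the gist does for MP4.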
Hi,
Just wondering if you have any full working demo?
Would be great help!
Thanks
The code should not work as posted: note the line for (var cam in cameras), but there is nowhere that the cameras array is defined, so the essential code in that loop won't run.
I used this code to create a working test harness, and it works with video.js in Chrome and FF, but IE11 will not play back the video. It downloads the beginning and then just quits out.
I uploaded my edits to the gist @ https://gist.github.com/dbussert/0cf0476b15ab45e0851d/6b3d6f049bd9449ac82f6e8f3df488620f637ddb
A stack of technologies is needed here: node-rtsp-stream via the net module, or GStreamer with its RTSP server, a WebRTC gateway, a binary.js socket, and so on.
Hi,
when I use Safari, the live server breaks, but IE10, Firefox, and Chrome have no problem. I don't know why Safari doesn't work?
thanks~
For anyone looking to use this, removing the '-bsf:v dump_extra' option fixed the video for me for some reason.
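For reference, that change amounts to dropping the `-bsf:v dump_extra` pair from the spawn arguments. A sketch with the filter made optional; the helper name and RTSP URL are illustrative, not from the gist:

```javascript
// Hypothetical helper: builds the ffmpeg argument list from the gist,
// with the "-bsf:v dump_extra" bitstream filter made optional.
function buildFfmpegArgs(rtspUrl, useDumpExtra) {
  var args = [
    "-rtsp_transport", "tcp", "-i", rtspUrl,
    "-vcodec", "copy", "-f", "mp4", "-movflags", "frag_keyframe+empty_moov",
    "-reset_timestamps", "1", "-vsync", "1", "-flags", "global_header"
  ];
  if (useDumpExtra) args.push("-bsf:v", "dump_extra");
  args.push("-y", "-"); // output to stdout
  return args;
}
```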
You may use something like:
const express = require('express');
const url = require('url');
const child_process = require('child_process');

const app = express();
const port = 3601;
const cameras = [{ name: "cam1", rtsp: "rtsp://10.0.0.121/axis-media/media.amp", liveStarted: false }];

app.get('/', (req, res) => liveStream(req, res));
app.listen(port, () => console.log(`Server up on port ${port}`));
in front of the code above, to use it with Node.
Works for me, but video is jerky.
Hi, this looks quite interesting and I am interested in understanding your code better. Do you have a complete running example that I can look at?
Cheers
Sam