I've been searching for a while on Stack Overflow and around the web for a solution to my video-streaming problem. I need to stream live video captured from the camera (no high quality required) from an iOS device to a remote PC, one way only: the iOS device will send a video stream to the server/PC, but not the other way around.

After some googling and documentation browsing, it appears that there are two major standards/protocols that can be used:

  • Apple's HTTP Live Streaming (HLS)
  • Adobe's RTMP

Again, my requirement is that the iPhone/iPad will be streaming the video. From what appears on Apple's website, I understand that HLS is meant to be encoded on the server side and decoded on the iOS side. As for RTMP, most libraries that allow streaming from iOS have commercial licenses and closed code, or require you to go through their P2P infrastructure (for instance angl.tv or tokbox.com/opentok/quick-start). As for HLS, no encoding libraries seem to exist on the iOS side.

So my questions are:

  • Do you know of any SDK/library, preferably open and free, that I could integrate to stream captured video from within my app?
  • If not, do you think developing a custom library would be a risky jungle-crossing endeavour? My guess is to go through AVFoundation, capture camera frames, compress them frame by frame, and send them over HTTP (see the sketch just below this list). Does that sound crazy performance- and bandwidth-wise? Note that in that case I would need an HLS or RTMP encoder either way.
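
To make that guess concrete, here is a minimal Swift sketch of what I have in mind: AVFoundation capture with a sample-buffer delegate, naive per-frame JPEG compression, and an HTTP POST per frame. The upload URL and the JPEG approach are placeholders on my part; I realise a real solution would use a proper video codec (H.264) and an HLS or RTMP packager.

```swift
import AVFoundation
import CoreImage

// Sketch only: captures frames from the camera, JPEG-compresses each one,
// and POSTs it to a placeholder endpoint. Not a real streaming protocol.
final class CameraStreamer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private let context = CIContext()
    private let uploadURL = URL(string: "http://example.com/upload")! // hypothetical endpoint

    func start() throws {
        session.sessionPreset = .medium // no high quality required
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Naive per-frame JPEG compression; a real stream would use H.264.
        let image = CIImage(cvPixelBuffer: pixelBuffer)
        guard let jpeg = context.jpegRepresentation(of: image,
                                                    colorSpace: CGColorSpaceCreateDeviceRGB()) else { return }
        var request = URLRequest(url: uploadURL)
        request.httpMethod = "POST"
        URLSession.shared.uploadTask(with: request, from: jpeg).resume()
    }
}
```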

I thank you very much in advance dear friends.

Mehdi.

Medi The Jedi
  • It might be a lot more than what you need, but [webrtc](http://www.webrtc.org/) can do this (it's actually for cross-platform video calling without any plugins). It takes some time to set everything up, but if you want to expand your functionality later on this could be a good solution. – Kevin Nov 11 '14 at 08:05

1 Answer


I have developed such a library, and you can find it at github.com/jgh-/VideoCore

I am updating this answer because I have created a simplified iOS API that will allow you to easily set up a Camera/Mic RTMP session. You can find it at https://github.com/jgh-/VideoCore/blob/master/api/iOS/VCSimpleSession.h.
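
Roughly, from Swift it looks like the following (check VCSimpleSession.h above for the exact, current signatures; the RTMP URL and stream key below are placeholders):

```swift
import UIKit
// VideoCore is Objective-C/C++; import it through a bridging header
// (or as a framework) before using VCSimpleSession from Swift.

class BroadcastViewController: UIViewController {
    var session: VCSimpleSession!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Video size, frame rate, and bitrate here are just example values.
        session = VCSimpleSession(videoSize: CGSize(width: 1280, height: 720),
                                  frameRate: 30,
                                  bitrate: 1_000_000,
                                  useInterfaceOrientation: false)
        view.addSubview(session.previewView)
        session.previewView.frame = view.bounds
        // Placeholder ingest URL and stream key.
        session.startRtmpSession(withURL: "rtmp://your.server/live",
                                 andStreamKey: "your-stream-key")
    }
}
```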

Additionally, VideoCore is now available in CocoaPods.

jgh
  • Hi. Thank you for this library. Interesting work. Any plans to wrap it in a Cocoa Touch/Objective-C library? – Medi The Jedi Mar 12 '14 at 10:33
  • @MediTheJedi I have added a sample iOS project to VideoCore to demonstrate streaming to an RTMP server with the camera and microphone. Find it at https://github.com/jamesghurley/VideoCore/tree/master/sample – jgh May 07 '14 at 16:24
  • Is there any URL where we can see the videos posted from iOS devices? – Sundeep Saluja May 26 '14 at 09:41
  • You need to have an RTMP server available. Note that this can also connect to services such as Justin.tv, where you are given a stream key and ingest server URL. – jgh May 27 '14 at 01:23
  • I am getting a "file not found" error in the VideoCore library even though all files are included in the project; I have checked the path as well. Please help. – Sanjay Kumar Yadav May 29 '14 at 14:00
  • Please email me using the address found in the readme or open an issue. Be sure to include your errors and header search paths in the issue or email – jgh May 30 '14 at 02:52
  • This library keeps throwing this error: #include file not found. – Sanjay Kumar Yadav May 30 '14 at 07:05
  • Hi Jamesghurley, I have gone through your VideoCore sample iOS project. Can you give me some input on what we would have to do for HTTP streaming instead of RTMP? Which approach is better: opening a websocket and sending whatever we get after encoding, or transferring an MPEG-2 stream? And how can we build an MPEG-2 stream from encoded CMSampleBuffers? – Shiva Reddy Jun 03 '15 at 07:06
  • @jgh is it possible to stream with your lib to youtube? – MR.GEWA Jul 31 '15 at 04:50
  • It works! But the sample only covers the output stream from the camera; does this library also play an RTMP stream from a server? – m8labs Aug 23 '15 at 21:17
  • Will this Swift code broadcast live video to my RTMP server from my iPhone? – Saty Oct 27 '15 at 10:32
  • Hi @jgh, I tried the sample app. I included my RTMP server address with its key, and it's giving me the error below: ClientState: 2 ClientState: 3 ClientState: 4 ClientState: 5 ClientState: 6 received server window size: 2500000 received peer bandwidth limit: 2500000 type: 2 Request to change incoming chunk size from 128 -> 4096 Received invoke pktId: 1 received invoke _result tracked command: connect ClientState: 7 Received invoke pktId: 0 received invoke onBWDone ClientState: 11 – Saty Dec 02 '15 at 12:52
  • @SanjayKumarYadav Yes, you are right. I noticed the same issue. – Abdul Yasin Aug 05 '16 at 12:11
  • Can anyone share some sample code for live video streaming in Swift, please? – Catherine Dec 26 '17 at 09:39