HaishinKit for Android, iOS, macOS and tvOS.
- Camera and microphone streaming library via RTMP for Android.
- API Documentation
- If you need help making live-streaming requests with HaishinKit, use GitHub Discussions with the Q&A category.
- If you'd like to discuss a feature request, use GitHub Discussions with the Ideas category.
- If you've found a HaishinKit bug 🐛, use a GitHub Issue with the Bug report template.
- If you don't use an issue template, I will close your issue immediately without comment.
- If you want to contribute, submit a pull request!
- If you want e-mail-based support outside GitHub, consulting is available at $50 per incident. I am able to respond within a few days.
- Discord chatroom.
- If you understand Japanese, please write in Japanese!
| Project name | Notes | License |
|---|---|---|
| HaishinKit for iOS, macOS and tvOS | Camera and microphone streaming library via RTMP for iOS, macOS and tvOS | BSD 3-Clause "New" or "Revised" License |
| HaishinKit for Flutter | Camera and microphone streaming library via RTMP for Flutter | BSD 3-Clause "New" or "Revised" License |
- Authentication
- Publish
- Playback
- Action Message Format
  - AMF0
  - AMF3
- SharedObject
- RTMPS
  - Native (RTMP over SSL/TLS)
- Enhanced RTMP (work in progress)
  - v1
  - v2
- Audio Codecs
  - AAC
- Video Codecs
  - H264, HEVC
Local recording is now supported. You can also specify videoSettings and audioSettings for the recording separately from the live stream.

```kotlin
val recorder: StreamRecorder by lazy { StreamRecorder(requireContext()) }
recorder.videoSettings.profileLevel = VideoCodecProfileLevel.HEVC_MAIN_3_1
recorder.attachStream(stream)
recorder.startRecording(
    File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS), "output.mp4").toString(),
    MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4
)
```
- Single camera with the Camera2 API
- Multiple cameras with the Camera2 API
- Screen capture with MediaProjection
- Microphone with the AudioRecord API
| | HkSurfaceView | HkTextureView |
|---|---|---|
| Engine | SurfaceView | TextureView |
| Playback | beta | beta |
| Publish | ✅ Stable | ✅ Stable |
| Note | Recommended for Android 7.0+ | Recommended for Android 5.0–6.0 |
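Either view is declared in a layout XML like any other Android view. A minimal sketch (the fully qualified class name `com.haishinkit.view.HkSurfaceView` and the `camera` id are assumptions for illustration, not taken from this README):

```xml
<!-- Hypothetical layout fragment; the package path is an assumption. -->
<com.haishinkit.view.HkSurfaceView
    android:id="@+id/camera"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
```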
- Hardware acceleration for H264 video encoding and AAC audio encoding.
- Asynchronous processing.
- Graphics APIs
  - ✅ OpenGL
  - 🐛 Vulkan
```kotlin
stream.audioSettings.bitrate = 32 * 1000
stream.videoSettings.width = 640 // The width resolution of the video output.
stream.videoSettings.height = 360 // The height resolution of the video output.
stream.videoSettings.bitrate = 160 * 1000 // The bitrate of the video output.
stream.videoSettings.IFrameInterval = 2 // The key-frame interval.
```
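As a worked example of what the key-frame interval implies (this is an illustrative calculation, not a HaishinKit API): an interval of 2 seconds at 30 fps means one key frame every 60 frames.

```kotlin
// GOP length in frames for a given key-frame interval (seconds) and frame rate.
fun gopFrames(iFrameIntervalSeconds: Int, fps: Int): Int = iFrameIntervalSeconds * fps

fun main() {
    // IFrameInterval = 2 at 30 fps yields one key frame every 60 frames.
    println(gopFrames(2, 30))  // 60
}
```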
Off-screen rendering makes it possible to overlay any text or bitmap on the video while broadcasting or viewing. This enables applications such as watermarking and time display.
```kotlin
stream.attachVideo(cameraSource)

val text = Text()
text.textSize = 60f
text.textValue = "23:44:56"
text.layoutMargins.set(0, 0, 16, 16)
text.horizontalAlignment = ScreenObject.HORIZONTAL_ALIGNMENT_RIGHT
text.verticalAlignment = ScreenObject.VERTICAL_ALIGNMENT_BOTTOM
stream.screen.addChild(text)

val image = Image()
image.bitmap = BitmapFactory.decodeResource(resources, R.drawable.game_jikkyou)
image.verticalAlignment = ScreenObject.VERTICAL_ALIGNMENT_BOTTOM
image.frame.set(0, 0, 180, 180)
stream.screen.addChild(image)
```
An example project is available for Android.
- Camera and microphone publishing.
- RTMP playback.
```shell
git clone https://github.com/shogo4405/HaishinKit.kt.git
cd HaishinKit.kt
git submodule update --init
# Open [Android Studio] -> [Open] ...
```
JitPack
- A common mistake is writing `implementation 'com.github.shogo4405.HaishinKit.kt'`, which does not work. The correct group name is `com.github.shogo4405.HaishinKit~kt` (note the `~`).
- Older versions may not be available on JitPack. If a version is unavailable, please use the latest version instead.
```groovy
allprojects {
    repositories {
        maven { url 'https://jitpack.io' }
    }
}

dependencies {
    implementation 'com.github.shogo4405.HaishinKit~kt:haishinkit:x.x.x'
    implementation 'com.github.shogo4405.HaishinKit~kt:compose:x.x.x'
    implementation 'com.github.shogo4405.HaishinKit~kt:lottie:x.x.x'
    implementation 'com.github.shogo4405.HaishinKit~kt:vulkan:x.x.x'
}
```
| Module | minSdk | Android | Requirement | Status | Description |
|---|---|---|---|---|---|
| haishinkit | 21+ | 5 | Required | Stable | The base module for HaishinKit. |
| compose | 21+ | 5 | Optional | Beta | Composable components for HaishinKit. |
| lottie | 21+ | 5 | Optional | Beta | A module for embedding Lottie animations into live-streaming video. |
| vulkan | 26+ | 8 | Optional | Technical preview | Support for the Vulkan graphics engine. |
```xml
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```

```kotlin
ActivityCompat.requestPermissions(this, arrayOf(
    Manifest.permission.CAMERA,
    Manifest.permission.RECORD_AUDIO
), 1)
```
Real Time Messaging Protocol (RTMP).
```kotlin
class CameraTabFragment : Fragment(), IEventListener {
    private lateinit var connection: RtmpConnection
    private lateinit var stream: RtmpStream
    private lateinit var cameraView: HkGLSurfaceView
    private lateinit var cameraSource: CameraSource

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        activity?.let {
            val permissionCheck = ContextCompat.checkSelfPermission(it, Manifest.permission.CAMERA)
            if (permissionCheck != PackageManager.PERMISSION_GRANTED) {
                ActivityCompat.requestPermissions(it, arrayOf(Manifest.permission.CAMERA), 1)
            }
            if (ContextCompat.checkSelfPermission(
                    it,
                    Manifest.permission.RECORD_AUDIO
                ) != PackageManager.PERMISSION_GRANTED
            ) {
                ActivityCompat.requestPermissions(it, arrayOf(Manifest.permission.RECORD_AUDIO), 1)
            }
        }
        connection = RtmpConnection()
        stream = RtmpStream(connection)
        stream.attachAudio(AudioRecordSource())
        cameraSource = CameraSource(requireContext()).apply {
            open(CameraCharacteristics.LENS_FACING_BACK)
        }
        stream.attachVideo(cameraSource)
        connection.addEventListener(Event.RTMP_STATUS, this)
    }

    @SuppressLint("SetTextI18n")
    override fun onCreateView(
        inflater: LayoutInflater,
        container: ViewGroup?,
        savedInstanceState: Bundle?
    ): View {
        val v = inflater.inflate(R.layout.fragment_camera, container, false)
        val button = v.findViewById<Button>(R.id.button)
        button.setOnClickListener {
            if (button.text == "Publish") {
                connection.connect(Preference.shared.rtmpURL)
                button.text = "Stop"
            } else {
                connection.close()
                button.text = "Publish"
            }
        }
        val switchButton = v.findViewById<Button>(R.id.switch_button)
        switchButton.setOnClickListener {
            cameraSource.switchCamera()
        }
        cameraView = v.findViewById(R.id.camera)
        cameraView.attachStream(stream)
        return v
    }

    override fun onDestroy() {
        super.onDestroy()
        connection.dispose()
    }

    override fun handleEvent(event: Event) {
        Log.i("$TAG#handleEvent", event.toString())
        val data = EventUtils.toMap(event)
        val code = data["code"].toString()
        if (code == RtmpConnection.Code.CONNECT_SUCCESS.rawValue) {
            stream.publish(Preference.shared.streamName)
        }
    }

    companion object {
        fun newInstance(): CameraTabFragment {
            return CameraTabFragment()
        }

        private val TAG = CameraTabFragment::class.java.simpleName
    }
}
```
- [assets]
  - [shaders]
    - custom-shader.vert (optional)
    - custom-shader.frag
```kotlin
package my.custom.filter

import com.haishinkit.graphics.filter.VideoEffect

class Monochrome2VideoEffect(
    override val name: String = "custom-shader"
) : VideoEffect
```

```kotlin
stream.videoEffect = Monochrome2VideoEffect()
```
```xml
<uses-sdk tools:overrideLibrary="com.haishinkit.vulkan" />
```

```kotlin
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
    PixelTransformFactory.registerPixelTransform(VkPixelTransform::class)
}
```
- `rtmp://server-ip-address[:port]/application/[appInstance]/[prefix:[path1[/path2/]]]streamName`
- The `[]` marks denote optional parts.

```kotlin
rtmpConnection.connect("rtmp://server-ip-address[:port]/application/[appInstance]")
rtmpStream.publish("[prefix:[path1[/path2/]]]streamName")
```

For example, with `rtmp://localhost/live/streamName`:

```kotlin
rtmpConnection.connect("rtmp://localhost/live")
rtmpStream.publish("streamName")
```
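The split between the `connect()` and `publish()` arguments can be derived mechanically: everything up to the last path segment goes to `connect()`, and the final segment is the stream name. A small helper sketch (not part of HaishinKit; splitting at the last `/` is a simplifying assumption that ignores optional path prefixes):

```kotlin
// Splits a full RTMP endpoint into the connect URL and the stream name.
// Simplifying assumption: the stream name is always the final path segment.
fun splitRtmpEndpoint(endpoint: String): Pair<String, String> {
    val index = endpoint.lastIndexOf('/')
    require(index > "rtmp://".length) { "not a valid RTMP endpoint: $endpoint" }
    return endpoint.substring(0, index) to endpoint.substring(index + 1)
}

fun main() {
    val (connectUrl, streamName) = splitRtmpEndpoint("rtmp://localhost/live/streamName")
    println(connectUrl)  // rtmp://localhost/live
    println(streamName)  // streamName
}
```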
- HaishinKit.swift - Camera and Microphone streaming library via RTMP, HLS for iOS, macOS and tvOS.
BSD-3-Clause