Playing video in Unity, and playing video with TextureView + SurfaceTexture + OpenGL ES

The steps for playing video in Unity are as follows:

Unity added the VideoPlayer component starting with 5.6, which makes video playback fairly simple. I evaluated it for a project requirement and ran into quite a few pitfalls; a quick Google/Baidu search shows these problems are indeed common. Some of the simpler ones:

Before getting to the code, let me explain what TextureView, SurfaceTexture and OpenGL
ES actually are, and how I use them to display a video.

Note: I recently worked on a preliminary research project for live streaming,
so I'm writing down how the streaming was implemented and how some of the problems were solved along the way, using the Android implementation as the illustration.

1. Drag the video to be played into the Project. (Note: the video formats Unity generally supports are .mov, .mpg,
.mpeg, .mp4, .avi and .asf)

1) No sound during playback

TextureView, as the name suggests, is just a control that extends View. The official description reads:
A TextureView can be used to display a content stream. Such a content
stream can for instance be a video or an OpenGL scene. The content
stream can come from the application’s process as well as a remote
process.

It can display a content stream, such as a video stream or an OpenGL-rendered scene. The stream can come from the application's own process or from a remote process. That sounds convoluted; my understanding is that it can be, say, either a local video stream or a network video stream.
Note that TextureView renders with hardware acceleration. It is much like hardware versus software video decoding: one decodes on the GPU, the other on the CPU.
So how do you actually use a TextureView?
This is where SurfaceTexture enters the picture. The names tell the story: TextureView is first of all a View, while SurfaceTexture
is first of all a Texture. Its official description:
Captures frames from an image stream as an OpenGL ES texture. The image
stream may come from either camera preview or video decode.

In other words, it can capture a frame of an image stream to use as an OpenGL texture. The image stream mainly comes from a camera preview or a video decoder. (I suspect this capability could be used for a great many things.)
At this point we have a texture, so OpenGL ES can get to work: it binds the texture and draws it frame by frame onto the TextureView, and that is the video image we see. (For more background on SurfaceTexture and TextureView, see the references.)
That's enough theory; time for some code. Good code should read like good literature. Not that mine is that pretty, it's just something to aim for.

Architecture

2. Add a RawImage to the scene. (Image renders with a Sprite, while RawImage renders with a Texture.)

2) Controlling playback progress with a Slider

Code

Let's start with the MainActivity class:

import java.io.IOException;

import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.os.Environment;
import android.support.v7.app.AppCompatActivity;
import android.view.Surface;
import android.view.TextureView;

public class MainActivity extends AppCompatActivity implements TextureView.SurfaceTextureListener,
        MediaPlayer.OnPreparedListener {
    /** Path of the local video file */
    public String videoPath = Environment.getExternalStorageDirectory().getPath()+"/aoa.mkv";
    private TextureView textureView;
    private MediaPlayer mediaPlayer;
    /**
     * The configuration that happens before the video is drawn lives in this object's class.
     * The actual drawing work is done in its subclass VideoTextureSurfaceRenderer.
     */
    private TextureSurfaceRenderer videoRenderer;
    private int surfaceWidth;
    private int surfaceHeight;
    private Surface surface;


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        textureView = (TextureView) findViewById(R.id.id_textureview);
        // Register a SurfaceTextureListener so we get notified about the SurfaceTexture
        textureView.setSurfaceTextureListener(this);

    }
    /**
     * Entry point for playback; called once the SurfaceTexture becomes available.
     */
    private void playVideo() {
        if (mediaPlayer == null) {
            videoRenderer = new VideoTextureSurfaceRenderer(this, textureView.getSurfaceTexture(), surfaceWidth, surfaceHeight);
            surface = new Surface(videoRenderer.getSurfaceTexture());
            initMediaPlayer();
        }
    }

    private void initMediaPlayer() {
        this.mediaPlayer = new MediaPlayer();
        try {
            mediaPlayer.setDataSource(videoPath);
            mediaPlayer.setSurface(surface);
            // register the listener before preparing so the callback cannot be missed
            mediaPlayer.setOnPreparedListener(this);
            mediaPlayer.setLooping(true);
            mediaPlayer.prepareAsync();
        } catch (IllegalArgumentException | SecurityException
                | IllegalStateException | IOException e) {
            e.printStackTrace();
        }
    }
    @Override
    public void onPrepared(MediaPlayer mp) {
        try {
            if (mp != null) {
                mp.start(); // playback actually starts here
            }
        } catch (IllegalStateException e) {
            e.printStackTrace();
        }
    }


    @Override
    protected void onResume() {
        super.onResume();
        if (textureView.isAvailable()) {
            playVideo();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (videoRenderer != null) {
            videoRenderer.onPause();  // remember to stop the video drawing thread
        }
        if (mediaPlayer != null) {
            mediaPlayer.release();
            mediaPlayer =null;
        }
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        surfaceWidth = width;
        surfaceHeight = height;
        playVideo();
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return false;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {

    }

}

That is the program's entry class. I won't go into how MediaPlayer plays a video source here; there is actually a lot to it, and you can look it up yourselves. One thing I do want to point out: playback is usually wired to a SurfaceView through its SurfaceHolder, whereas here I hand a Surface directly to MediaPlayer.setSurface() (see the references on Surface); that is where this player differs from the usual approach. I'll stop here for now; the core drawing work will follow in a later post when I have time. If anything above is wrong, I hope you'll point it out. Many thanks!
The follow-up post, "Playing video with TextureView + SurfaceTexture + OpenGL
ES (2)", is already written.

  • unity texture plugin and video capture (video source)
    VideoSourceCamera
  • microphone capture (audio source)
    AudioSourceMIC
  • video encoding
    VideoEncoder
  • audio encoding
    AudioEncoder
  • FLV muxing
    MuxerFLV
  • HTTP stream publishing (upload)
    PublisherHttp
  • stream playback (replay)
    play
  • OpenGL image processing

3. Add a VideoPlayer component to the RawImage and assign the video to the VideoPlayer by dragging it onto the Video
Clip slot.

3) Taking a screenshot of the video (Texture -> Texture2D)

Starting with this article I'll walk through the implementation details of these components and how the dependencies between them are handled.

4. Create a script PlayVodeoOnUGUI. The core code is rawImage.texture =
videoPlayer.texture: assign the video's texture to the RawImage and you can see the video playing.

4) Firing an event when playback finishes

(1) —— The Unity texture plugin

Our live-streaming project serves Unity, and Unity is a cross-platform game engine that sits on top of DirectX,
OpenGL or OpenGL ES depending on the platform, so the graphics plugin has to be implemented per platform.
(Unity's native plugin interface docs)
https://docs.unity3d.com/Manual/NativePluginInterface.html
For live streaming on the Android platform, the Unity graphics plugin is mostly about render-thread notification:
video capture, texture creation,
image processing (shaders) and handing the encoded video texture around all have to run on Unity's render thread.
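
As a sketch of what the C# side of this boundary can look like (the library name liveplugin and the exports GetRenderEventFunc / SetCameraTexture are hypothetical placeholders, not the project's actual plugin API):

    using System;
    using System.Runtime.InteropServices;
    using UnityEngine;

    public class LiveTextureBridge : MonoBehaviour
    {
        // hypothetical native exports -- the real plugin defines its own entry points
        [DllImport("liveplugin")] private static extern IntPtr GetRenderEventFunc();
        [DllImport("liveplugin")] private static extern void SetCameraTexture(IntPtr texId, int width, int height);

        private Texture2D streamTexture;

        void Start()
        {
            // Unity creates the texture and hands its native ID to the plugin once
            streamTexture = new Texture2D(1280, 720, TextureFormat.RGBA32, false);
            SetCameraTexture(streamTexture.GetNativeTexturePtr(), streamTexture.width, streamTexture.height);
        }

        void Update()
        {
            // GL.IssuePluginEvent makes Unity invoke the plugin callback on its
            // render thread, which is where the GL work described above must happen
            GL.IssuePluginEvent(GetRenderEventFunc(), 1);
        }
    }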

  • Unity creates the texture and passes its texture ID to the streaming plugin.

  • Open the camera device and get the capture texture ready:
    mCameraGLTexture =
        new GLTexture(width, height, GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_RGBA);
    note: the camera texture is a special kind of texture, created with the
    GLES11Ext.GL_TEXTURE_EXTERNAL_OES target

  • A callback notifies us each time a frame of data is ready:
    public void onFrameAvailable(final SurfaceTexture surfaceTexture)
    {
        // push the frame from the capture thread over to the render thread
        getProcessor().append(new Task() {
            @Override
            public void run() {
                surfaceTexture.updateTexImage();
            }
        });
    }

    The camera texture also needs a special declaration in the fragment shader:

      #extension GL_OES_EGL_image_external : require
      precision mediump float;
      uniform samplerExternalOES uTexture0;
      varying vec2 texCoordinate;
      void main(){
          gl_FragColor = texture2D(uTexture0, texCoordinate);
      }
    
  • Write the camera texture into the Unity texture.
    Copying one texture into another can be done in two ways:

    • Via glReadPixels, but that incurs a huge memory copy and heavy CPU load.

    • Render to texture:
      mTextureCanvas = new GLRenderTexture(mGLTexture); // create the render target

        void renderCamera2Texture()
        {
            mTextureCanvas.begin();
            cameraDrawObject.draw();   // draw the camera quad into the bound FBO
            mTextureCanvas.end();
        }
      

      The implementation of GLRenderTexture looks like this:

        GLRenderTexture(GLTexture tex)
        {
            mTex = tex;
            int fboTex = tex.getTextureID();
            GLES20.glGenFramebuffers(1, bufferObjects, 0);
            GLHelper.checkGlError("glGenFramebuffers");
            fobID = bufferObjects[0];

            // create the render buffer (used below as the depth attachment)
            GLES20.glGenRenderbuffers(1, bufferObjects, 0);
            renderBufferId = bufferObjects[0];
            // bind the frame buffer
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fobID);
            GLHelper.checkGlError("glBindFramebuffer");
            // bind the render buffer and define its dimensions
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBufferId);
            GLHelper.checkGlError("glBindRenderbuffer");
            GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, tex.getWidth(), tex.getHeight());
            GLHelper.checkGlError("glRenderbufferStorage");
            // attach the texture as the framebuffer's color attachment
            GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, fboTex, 0);
            GLHelper.checkGlError("glFramebufferTexture2D");
            // attach the depth buffer
            GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, renderBufferId);
            GLHelper.checkGlError("glFramebufferRenderbuffer");
            // we are done, reset
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, 0);
            GLHelper.checkGlError("glBindRenderbuffer");
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
            GLHelper.checkGlError("glBindFramebuffer");
        }
      
        void begin()
        {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fobID);
            GLHelper.checkGlError("glBindFramebuffer");
            GLES20.glViewport(0, 0, mTex.getWidth(), mTex.getHeight());
            GLHelper.checkGlError("glViewport");
        }
      
        void end()
        {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        }
      
  • Beauty filter
    Real-time beauty effects (skin whitening, skin smoothing) are implemented with shaders.
    (For the principles behind the beauty effects, see)
    http://meituplus.com/?p=101
    (for more real-time shader processing, see)
    https://github.com/wuhaoyu1990/MagicCamera

None of these turned out to be a big problem. Below I first cover basic Video Player usage and then work through the solutions to the issues listed above.

 

(1) Creating a Video Player: you can add the Video Player component to a UI object, or just right-click > Video > Video Player. Once added you will see the component shown below.

[Figure 1: the Video Player component in the Inspector]

This article focuses on a few of the parameters. Source has two modes, Video Clip and URL: Clip plays a VideoClip directly, while URL plays from a path or address. Render Mode decides how the video is rendered (Camera, Material Override, etc.); for UI playback choose Render Texture, which is what this article uses. Audio Output Mode has several options: None, Direct (untested here) and Audio Source; this article uses Audio Source. In that mode you only need to drag an AudioSource component into the Video Player's Audio Source slot, with no other setup. Occasionally, though, the Audio Source slot comes up empty after dragging and no sound plays, so it is generally safer to add everything in code, as shown below:

 

        // add the components in code
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //videoPlayer = gameObject.GetComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();
        //audioSource = gameObject.GetComponent<AudioSource>();
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        audioSource.Pause();
        // route the video's audio through the AudioSource; these two lines
        // (also in the full code below) are what fix the "no sound" issue
        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.SetTargetAudioSource(0, audioSource);

 

(2) Controlling playback is much like controlling audio or an animation: VideoPlayer has Play/Pause and similar methods; see the full code at the end for details.

         
When playback completes, the loopPointReached event is invoked (the name follows other write-ups; strictly speaking it is not a "playback finished" event). As the name suggests, it fires when the video reaches its loop point: in my tests, when the VideoPlayer's isLooping property is true (i.e. the video loops), the callback fires as the video ends; when the video is not looping, the callback never arrives at the end. So to get the callback, set the video to loop and stop playback inside the handler registered on loopPointReached, as sketched below.
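
Here is a minimal sketch of that workaround (class and handler names are mine, not from the original post):

    using UnityEngine;
    using UnityEngine.Video;

    public class VideoFinishWatcher : MonoBehaviour
    {
        void Start()
        {
            var player = GetComponent<VideoPlayer>();
            player.isLooping = true;                 // loop, so loopPointReached actually fires
            player.loopPointReached += OnLoopPoint;  // handler signature: void (VideoPlayer source)
            player.Play();
        }

        private void OnLoopPoint(VideoPlayer source)
        {
            source.Stop();  // stop before the video wraps around -- this is our "finished" event
            Debug.Log("video finished");
        }
    }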

(3) On the UI side of video playback: when using Render Texture mode you have to assign the Target
Texture.

      
1) In the Project panel choose Create > Render Texture, then drag the new Render Texture onto the Video Player's Target Texture slot.

      
2) In the Hierarchy panel create a UI > Raw Image, and drag the Render Texture from the previous step onto the Raw Image's Texture slot.

      
Actually none of that is necessary: VideoPlayer exposes a texture property, so you can simply assign it to the RawImage's texture in Update, like this:

rawImage.texture = videoPlayer.texture;

For video screenshots you can grab the frame via videoPlayer.texture, but you need to convert the Texture to a Texture2D. Although Texture2D derives from Texture, you cannot simply cast back down; the conversion and saving code is as follows:

    private void SaveRenderTextureToPNG(Texture inputTex, string file)
    {
        // blit into a temporary RenderTexture so any Texture type can be read back
        RenderTexture temp = RenderTexture.GetTemporary(inputTex.width, inputTex.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(inputTex, temp);
        Texture2D tex2D = GetRTPixels(temp);
        RenderTexture.ReleaseTemporary(temp);
        File.WriteAllBytes(file, tex2D.EncodeToPNG());
    }

    private Texture2D GetRTPixels(RenderTexture rt)
    {
        // ReadPixels copies from the currently active RenderTexture into the Texture2D
        RenderTexture currentActiveRT = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(rt.width, rt.height);
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        RenderTexture.active = currentActiveRT;
        return tex;
    }

 

Finally, a word about controlling playback progress with a Slider.

Driving the video with a slider has two sides that conflict: on one hand, Update keeps writing videoPlayer.time into the slider; on the other hand, the slider's value has to be written back into time. Doing the write-back in the slider's OnValueChanged(float value) clashes with the Update writes and causes glitches. Instead, use the UI BeginDrag and EndDrag events: on BeginDrag, stop writing to the slider; on EndDrag, apply the chosen time and resume. As shown below.

[Figure 2: EventTrigger entries for BeginDrag and EndDrag on the slider]
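
If you would rather wire these two events up in code than in the Inspector, here is a sketch using EventTrigger (it assumes the VideoController class from the full code below, whose public OnPointerDown / OnPointerUp do the actual work):

    using UnityEngine;
    using UnityEngine.EventSystems;
    using UnityEngine.UI;

    public class SliderDragHook : MonoBehaviour
    {
        public Slider videoSlider;
        public VideoController controller;  // exposes OnPointerDown / OnPointerUp

        void Start()
        {
            var trigger = videoSlider.gameObject.AddComponent<EventTrigger>();

            var beginDrag = new EventTrigger.Entry { eventID = EventTriggerType.BeginDrag };
            beginDrag.callback.AddListener(_ => controller.OnPointerDown());  // stop driving the slider
            trigger.triggers.Add(beginDrag);

            var endDrag = new EventTrigger.Entry { eventID = EventTriggerType.EndDrag };
            endDrag.callback.AddListener(_ => controller.OnPointerUp());      // apply time, resume
            trigger.triggers.Add(endDrag);
        }
    }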

 Full code

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class VideoController : MonoBehaviour {
    public GameObject screen;
    public Text videoLength;
    public Text currentLength;
    public Slider volumeSlider;
    public Slider videoSlider;

    private string video1Url;
    private string video2Url;
    private VideoPlayer videoPlayer;
    private AudioSource audioSource;
    private RawImage videoScreen;
    private float lastCountTime = 0;
    private float totalPlayTime = 0;
    private float totalVideoLength = 0;

    private bool b_firstVideo = true;
    private bool b_adjustVideo = false;
    private bool b_skip = false;
    private bool b_capture = false;

    private string imageDir =@"D:\test\Test\bwadmRe";

    // Use this for initialization
    void Start () {
        videoScreen = screen.GetComponent<RawImage>();
        string dir = Path.Combine(Application.streamingAssetsPath,"Test");
        video1Url = Path.Combine(dir, "01.mp4");
        video2Url = Path.Combine(dir, "02.mp4");

        // add the components in code
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //videoPlayer = gameObject.GetComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();
        //audioSource = gameObject.GetComponent<AudioSource>();
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        audioSource.Pause();

        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.SetTargetAudioSource(0, audioSource);

        VideoInfoInit(video1Url);
        videoPlayer.loopPointReached += OnFinish;
    }

    #region private method
    private void VideoInfoInit(string url)
    {
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = url;

        // unsubscribe first so switching videos doesn't stack up handlers
        videoPlayer.prepareCompleted -= OnPrepared;
        videoPlayer.prepareCompleted += OnPrepared;
        videoPlayer.isLooping = true;

        videoPlayer.Prepare();
    }

    private void OnPrepared(VideoPlayer player)
    {
        player.Play();
        totalVideoLength = videoPlayer.frameCount / videoPlayer.frameRate;
        videoSlider.maxValue = totalVideoLength;
        videoLength.text = FloatToTime(totalVideoLength);

        lastCountTime = 0;
        totalPlayTime = 0;
    }

    private string FloatToTime(float time)
    {
        int hour = (int)time / 3600;
        int min = (int)(time - hour * 3600) / 60;
        int sec = (int)(time - hour * 3600) % 60;
        string text = string.Format("{0:D2}:{1:D2}:{2:D2}", hour, min, sec);
        return text;
    }

    private IEnumerator PlayTime(int count)
    {
        for(int i=0;i<count;i++)
        {
            yield return null;
        }
        videoSlider.value = (float)videoPlayer.time;
        //videoSlider.value = videoSlider.maxValue * (time / totalVideoLength);
    }

    private void OnFinish(VideoPlayer player)
    {
        Debug.Log("finished");        
    }

    private void SaveRenderTextureToPNG(Texture inputTex, string file)
    {
        RenderTexture temp = RenderTexture.GetTemporary(inputTex.width, inputTex.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(inputTex, temp);
        Texture2D tex2D = GetRTPixels(temp);
        RenderTexture.ReleaseTemporary(temp);
        File.WriteAllBytes(file, tex2D.EncodeToPNG());
    }

    private Texture2D GetRTPixels(RenderTexture rt)
    {
        RenderTexture currentActiveRT = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(rt.width, rt.height);
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        RenderTexture.active = currentActiveRT;
        return tex;
    }
    #endregion

    #region public method
    // play
    public void OnStart()
    {
        videoPlayer.Play();
    }
    // pause
    public void OnPause()
    {
        videoPlayer.Pause();
    }
    // next video
    public void OnNext()
    {
        string nextUrl = b_firstVideo ? video2Url : video1Url;
        b_firstVideo = !b_firstVideo;

        videoSlider.value = 0;
        VideoInfoInit(nextUrl);
    }
    // volume control
    public void OnVolumeChanged(float value)
    {
        audioSource.volume = value;
    }
    // video progress via OnValueChanged -- conflicts with the Update writes,
    // so it stays unused; see the pointer handlers below
    public void OnVideoChanged(float value)
    {
        //videoPlayer.time = value;
    }
    public void OnPointerDown()
    {
        // BeginDrag: stop driving the slider from Update while the user drags
        b_adjustVideo = true;
        b_skip = true;
        videoPlayer.Pause();
    }
    public void OnPointerUp()
    {
        // EndDrag: apply the chosen time, then resume playback and slider updates
        videoPlayer.time = videoSlider.value;

        videoPlayer.Play();
        b_adjustVideo = false;
    }
    public void OnCapture()
    {
        b_capture = true;
    }
    #endregion

    // Update is called once per frame
    void Update () {
        if (videoPlayer.isPlaying)
        {            
            videoScreen.texture = videoPlayer.texture;
            float time = (float)videoPlayer.time;
            currentLength.text = FloatToTime(time);

            if(b_capture)
            {
                string name = DateTime.Now.Minute.ToString() + "_" + DateTime.Now.Second.ToString() + ".png";
                SaveRenderTextureToPNG(videoPlayer.texture,Path.Combine(imageDir,name));                
                b_capture = false;
            }

            if(!b_adjustVideo)
            {
                totalPlayTime += Time.deltaTime;
                if (!b_skip)
                {
                    videoSlider.value = (float)videoPlayer.time;
                    lastCountTime = totalPlayTime;
                }
                // after a seek, wait ~0.8s before driving the slider again so the
                // not-yet-updated playback time doesn't snap the handle back
                if (totalPlayTime - lastCountTime >= 0.8f)
                {
                    b_skip = false;
                }
            }
            //StartCoroutine(PlayTime(15));   

        }
    }
}

 

 

 

Alternatively, if you use the AVPro Video plugin, none of these minor problems come up.
