Saturday, March 17, 2012

Android 4.0 Ice Cream Sandwich Media Framework (1)

Notes on the Android 4.0 media framework: tracing the playback flow through Stagefright and NuPlayer.



1. In an APK, playing a media (audio/video) file typically looks like this:
            mMediaPlayer = new MediaPlayer();
            mMediaPlayer.setDataSource(path);
            mMediaPlayer.prepare();
            mMediaPlayer.start();

2. In Media Framework : android_src\frameworks\base\media\libmediaplayerservice\MediaPlayerService.cpp

Here, MediaPlayerService is the native service behind the MediaPlayer class in the Android SDK; the application's calls are forwarded to it.

typedef struct {
    const char *extension;
    const player_type playertype;
} extmap;

extmap FILE_EXTS [] =  {
        {".mid", SONIVOX_PLAYER},
        {".midi", SONIVOX_PLAYER},
        {".smf", SONIVOX_PLAYER},
        {".xmf", SONIVOX_PLAYER},
        {".imy", SONIVOX_PLAYER},
        {".rtttl", SONIVOX_PLAYER},
        {".rtx", SONIVOX_PLAYER},
        {".ota", SONIVOX_PLAYER},
};


status_t MediaPlayerService::Client::setDataSource(
        const char *url, const KeyedVector<String8, String8> *headers)
{
    player_type playerType = getPlayerType(url);
    LOGV("player type = %d", playerType);

    // create the right type of player
    sp<MediaPlayerBase> p = createPlayer(playerType);

    mStatus = p->setDataSource(url, headers);
}

We can see that setDataSource() first selects a player type for the given URL by calling getPlayerType(url):

player_type getPlayerType(const char* url)
{


    if (!strncasecmp("http://", url, 7)
            || !strncasecmp("https://", url, 8)) {
        size_t len = strlen(url);
        if (len >= 5 && !strcasecmp(".m3u8", &url[len - 5])) {
            return NU_PLAYER;
        }


        if (strstr(url,"m3u8")) {
            return NU_PLAYER;
        }
    }


    if (!strncasecmp("rtsp://", url, 7)) {
        return NU_PLAYER;
    }


    int lenURL = strlen(url);
    for (int i = 0; i < NELEM(FILE_EXTS); ++i) {
        int len = strlen(FILE_EXTS[i].extension);
        int start = lenURL - len;
        if (start > 0) {
            if (!strncasecmp(url + start, FILE_EXTS[i].extension, len)) {
                return FILE_EXTS[i].playertype;
            }
        }
    }


    return getDefaultPlayerType();
}



static player_type getDefaultPlayerType() {
    return STAGEFRIGHT_PLAYER;
}




static sp<MediaPlayerBase> createPlayer(player_type playerType, void* cookie,
        notify_callback_f notifyFunc)
{
    sp<MediaPlayerBase> p;
    switch (playerType) {
        case SONIVOX_PLAYER:
            LOGD(" create MidiFile");
            p = new MidiFile();
            break;
        case STAGEFRIGHT_PLAYER:
            LOGD(" create StagefrightPlayer");
            p = new StagefrightPlayer;
            break;
        case NU_PLAYER:
            LOGD(" create NuPlayer");
            p = new NuPlayerDriver;
            break;
        case TEST_PLAYER:
            LOGD("Create Test Player stub");
            p = new TestPlayerStub();
            break;
        default:
            LOGD("Unknown player type: %d", playerType);
            return NULL;
    }

    return p;
}

In Media Framework :android_src\frameworks\base\include\media\MediaPlayerInterface.h

enum player_type {
    PV_PLAYER = 1,//legacy of OpenCORE
    SONIVOX_PLAYER = 2,
    STAGEFRIGHT_PLAYER = 3,
    NU_PLAYER = 4,
    // Test players are available only in the 'test' and 'eng' builds.
    // The shared library with the test player is passed as an
    // argument to the 'test:' url in the setDataSource call.
    TEST_PLAYER = 5,
};




Three players can handle the URL (a few example inputs are shown after this list):
1. NuPlayer - if the URL starts with http:// or https:// and contains "m3u8" (an HTTP Live Streaming playlist), or starts with rtsp://
2. SONIVOX_PLAYER - if the file extension is one of {".mid", ".midi", ".smf", ".xmf", ".imy", ".rtttl", ".rtx", ".ota"}
3. StagefrightPlayer - everything else (the default player type)
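For example (illustrative calls only; the URLs and paths are hypothetical), the selection logic above would classify inputs roughly as follows:

// Illustrative only - hypothetical URLs run through getPlayerType() above.
getPlayerType("http://example.com/live/playlist.m3u8"); // NU_PLAYER (HLS playlist)
getPlayerType("rtsp://example.com/stream");             // NU_PLAYER (RTSP)
getPlayerType("/sdcard/ringtone.mid");                   // SONIVOX_PLAYER (MIDI extension)
getPlayerType("/sdcard/movie.mp4");                      // STAGEFRIGHT_PLAYER (default)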

In Android 2.3 there were only StagefrightPlayer and the SoniVox player.

NuPlayer first appeared in Android 3.x and was introduced for Apple's HTTP Live Streaming standard; as the code above shows, in Android 4.0 it also handles rtsp:// URLs.

Let's focus on StagefrightPlayer and NuPlayer, starting with Stagefright.

---===Stagefright:===---


In Media Framework :android_src\frameworks\base\media\libmediaplayerservice\StagefrightPlayer.cpp

StagefrightPlayer::StagefrightPlayer()
    : mPlayer(new AwesomePlayer) {
    LOGV("StagefrightPlayer");


    mPlayer->setListener(this);
}

The constructor of StagefrightPlayer first creates an AwesomePlayer, stores it as mPlayer, and registers itself as the listener.

In Media Framework :android_src\frameworks\base\media\libstagefright\AwesomePlayer.cpp

AwesomePlayer::AwesomePlayer()
    : mTimeSource(NULL),
      mAudioPlayer(NULL),
      mVideoBuffer(NULL),
      mLastVideoTimeUs(-1) {

    mVideoEvent = new AwesomeEvent(this, &AwesomePlayer::onVideoEvent);
    mVideoEventPending = false;
    mStreamDoneEvent = new AwesomeEvent(this, &AwesomePlayer::onStreamDone);
    mStreamDoneEventPending = false;
    mBufferingEvent = new AwesomeEvent(this, &AwesomePlayer::onBufferingUpdate);
    mBufferingEventPending = false;
    mVideoLagEvent = new AwesomeEvent(this, &AwesomePlayer::onVideoLagUpdate);
    mVideoEventPending = false;
    mCheckAudioStatusEvent = new AwesomeEvent(this, &AwesomePlayer::onCheckAudioStatus);
}

In the constructor of AwesomePlayer, several AwesomeEvent objects are registered; they will be fired at the proper time by the TimedEventQueue. This is why AwesomePlayer's behavior is described as "event driven".

struct AwesomeEvent : public TimedEventQueue::Event {
    AwesomeEvent(
            AwesomePlayer *player,
            void (AwesomePlayer::*method)())
        : mPlayer(player),
          mMethod(method) {
    }

    virtual void fire(TimedEventQueue *queue, int64_t /* now_us */) {
        (mPlayer->*mMethod)();   // the queue fires the event: call back into AwesomePlayer
    }

    AwesomePlayer *mPlayer;
    void (AwesomePlayer::*mMethod)();
};

In Media Framework :android_src\frameworks\base\media\libstagefright\include\TimedEventQueue.h

struct TimedEventQueue {

    void start();

    // Stop executing the event loop, if flush is false, any pending
    // events are discarded, otherwise the queue will stop (and this call
    // return) once all pending events have been handled.
    void stop(bool flush = false);

    // Posts an event to the front of the queue (after all events that
    // have previously been posted to the front but before timed events).
    event_id postEvent(const sp<Event> &event);

    event_id postEventToBack(const sp<Event> &event);

    // It is an error to post an event with a negative delay.
    event_id postEventWithDelay(const sp<Event> &event, int64_t delay_us);

    // If the event is to be posted at a time that has already passed,
    // it will fire as soon as possible.
    event_id postTimedEvent(const sp<Event> &event, int64_t realtime_us);

    // Returns true iff event is currently in the queue and has been
    // successfully cancelled. In this case the event will have been
    // removed from the queue and won't fire.
    bool cancelEvent(event_id id);

    static int64_t getRealTimeUs();
};
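AwesomePlayer owns one of these queues (mQueue) and keeps its events alive by re-posting them with a delay; when the delay expires, the queue fires the event and control returns to the player. A minimal sketch of this pattern, based on AwesomePlayer::postVideoEvent_l() (simplified, not the verbatim source):

// Simplified sketch: (re)arm the video event on AwesomePlayer's TimedEventQueue.
// When the delay expires, the queue fires mVideoEvent, which calls
// AwesomePlayer::onVideoEvent() through the AwesomeEvent wrapper shown above.
void AwesomePlayer::postVideoEvent_l(int64_t delayUs) {
    if (mVideoEventPending) {
        return;                 // already scheduled, don't post the same event twice
    }
    mVideoEventPending = true;
    mQueue.postEventWithDelay(mVideoEvent, delayUs < 0 ? 10000 : delayUs);
}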

Continue with setDataSource() from StagefrightPlayer; note that mPlayer is now an instance of AwesomePlayer.

In Media Framework :android_src\frameworks\base\media\libmediaplayerservice\StagefrightPlayer.cpp

status_t StagefrightPlayer::setDataSource(
        const char *url, const KeyedVector<String8, String8> *headers) {
    return mPlayer->setDataSource(url, headers);
}

In Media Framework :android_src\frameworks\base\media\libstagefright\AwesomePlayer.cpp

status_t AwesomePlayer::setDataSource(
        int fd, int64_t offset, int64_t length) {

    sp<DataSource> dataSource = new FileSource(fd, offset, length);
    return setDataSource_l(dataSource);
}


status_t AwesomePlayer::setDataSource_l(
        const sp<DataSource> &dataSource) {
    sp<MediaExtractor> extractor = MediaExtractor::Create(dataSource);

    return setDataSource_l(extractor);
}

MediaExtractor::Create() analyzes the content of the media file and dispatches the data source to the proper container-specific extractor for further parsing.

In Media Framework :android_src\frameworks\base\media\libstagefright\MediaExtractor.cpp

sp<MediaExtractor> MediaExtractor::Create(
        const sp<DataSource> &source, const char *mime) {

    MediaExtractor *ret = NULL;
    if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_MPEG4)
            || !strcasecmp(mime, "audio/mp4")) {
        ret = new MPEG4Extractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_MPEG)) {
        ret = new MP3Extractor(source, meta);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_AMR_NB)
            || !strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_AMR_WB)) {
        ret = new AMRExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_FLAC)) {
        ret = new FLACExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_WAV)) {
        ret = new WAVExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_OGG)) {
        ret = new OggExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_MATROSKA)) {
        ret = new MatroskaExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_MPEG2TS)) {
        ret = new MPEG2TSExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_WVM)) {
        ret = new WVMExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_AAC_ADTS)) {
        ret = new AACExtractor(source);
    } else if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_MPEG2PS)) {
        ret = new MPEG2PSExtractor(source);
    }

    return ret;
}
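The mime string (and the meta passed to MP3Extractor) above come from an earlier step in Create(): when the caller does not supply a MIME type, the data source is sniffed first. A simplified sketch of that step, assuming the ICS DataSource::sniff() interface:

// Simplified sketch of the sniffing step at the top of MediaExtractor::Create():
// the registered sniffers inspect the DataSource and report the container
// MIME type plus a confidence value and optional metadata.
String8 tmp;
float confidence;
sp<AMessage> meta;
if (!source->sniff(&tmp, &confidence, &meta)) {
    return NULL;    // no sniffer recognized the container format
}
const char *mime = tmp.string();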

Take H.264 video in an MP4 container as an example: an MPEG4Extractor will be created.

In Media Framework :android_src\frameworks\base\media\libstagefright\MPEG4Extractor.cpp

MPEG4Extractor::MPEG4Extractor(const sp<DataSource> &source)
    : mDataSource(source),
      mHasVideo(false),
      mFileMetaData(new MetaData) {
}

In Media Framework :android_src\frameworks\base\media\libstagefright\AwesomePlayer.cpp

status_t AwesomePlayer::setDataSource_l(const sp<MediaExtractor> &extractor) {

    bool haveAudio = false;
    bool haveVideo = false;
    for (size_t i = 0; i < extractor->countTracks(); ++i) {
        sp<MetaData> meta = extractor->getTrackMetaData(i);

        const char *_mime;
        CHECK(meta->findCString(kKeyMIMEType, &_mime));
        String8 mime = String8(_mime);

        if (!haveVideo && !strncasecmp(mime.string(), "video/", 6)) {
            setVideoSource(extractor->getTrack(i));
            haveVideo = true;
        } else if (!haveAudio && !strncasecmp(mime.string(), "audio/", 6)) {
            setAudioSource(extractor->getTrack(i));
            haveAudio = true;
        }
    }

    return OK;
}

In the code above, AwesomePlayer uses the MPEG4Extractor to separate the audio and video tracks.

In Media Framework :android_src\frameworks\base\media\libstagefright\MPEG4Extractor.cpp

sp<MediaSource> MPEG4Extractor::getTrack(size_t index) {
    // walk the parsed track list to the requested index (simplified)
    Track *track = mFirstTrack;
    while (index-- > 0) {
        track = track->next;
    }

    return new MPEG4Source(
            track->meta, mDataSource, track->timescale, track->sampleTable);
}


MPEG4Source::MPEG4Source(
        const sp<MetaData> &format,
        const sp<DataSource> &dataSource,
        int32_t timeScale,
        const sp<SampleTable> &sampleTable)
    : mFormat(format),
      mDataSource(dataSource),
      mIsAVC(false),
      mNALLengthSize(0),
      mWantsNALFragments(false),
      mSrcBuffer(NULL) {
    const char *mime;
    CHECK(mFormat->findCString(kKeyMIMEType, &mime));
    mIsAVC = !strcasecmp(mime, MEDIA_MIMETYPE_VIDEO_AVC);
}

The audio and video tracks are each wrapped as an MPEG4Source and returned to AwesomePlayer.

In Media Framework :android_src\frameworks\base\media\libstagefright\AwesomePlayer.cpp

void AwesomePlayer::setVideoSource(sp<MediaSource> source) {
    mVideoTrack = source;
}


void AwesomePlayer::setAudioSource(sp<MediaSource> source) {
    mAudioTrack = source;
}

After setVideoSource() and setAudioSource(), MediaPlayerService sets up the SurfaceTexture for future video rendering (this is how the player output connects to SurfaceFlinger).

In Media Framework :android_src\frameworks\base\media\libmediaplayerservice\MediaPlayerService.cpp

status_t MediaPlayerService::Client::setVideoSurfaceTexture(
        const sp<ISurfaceTexture>& surfaceTexture)
{
    LOGV("[%d] setVideoSurfaceTexture(%p)", mConnId, surfaceTexture.get());

    sp<MediaPlayerBase> p = getPlayer();

    sp<ANativeWindow> anw;

    if (surfaceTexture != NULL) {
        anw = new SurfaceTextureClient(surfaceTexture);
        status_t err = native_window_api_connect(anw.get(),
                NATIVE_WINDOW_API_MEDIA);
    }

    status_t err = p->setVideoSurfaceTexture(surfaceTexture);

    return err;
}
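From here the surface is forwarded down into the player itself. A simplified sketch of the Stagefright-side handoff (condensed from StagefrightPlayer.cpp and AwesomePlayer.cpp, not the verbatim source):

// Simplified sketch: StagefrightPlayer forwards the ISurfaceTexture to
// AwesomePlayer, which wraps it in a SurfaceTextureClient and keeps it as
// the ANativeWindow that the video renderer will draw into later.
status_t StagefrightPlayer::setVideoSurfaceTexture(
        const sp<ISurfaceTexture> &surfaceTexture) {
    return mPlayer->setSurfaceTexture(surfaceTexture);
}

status_t AwesomePlayer::setSurfaceTexture(const sp<ISurfaceTexture> &surfaceTexture) {
    Mutex::Autolock autoLock(mLock);
    return setNativeWindow_l(new SurfaceTextureClient(surfaceTexture));
}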

