Android 4.2 Camera Architecture and Implementation
1. The Camera architecture consists of a client side and a server side, which communicate with each other through the Binder mechanism.
The Camera implementation is split across two layers, native code and Java code:
Camera native framework:
frameworks/native/include/ui
frameworks/native/libs/ui
frameworks/av/camera/
The native Camera implementation lives in the directories above and is compiled into the libraries libui.so and libcamera_client.so.
Camera service:
frameworks/av/services/camera/libcameraservice
This part is compiled into libcameraservice.so.
Camera HAL:
frameworks/av/camera
frameworks/av/services/camera/libcameraservice/CameraHardwareInterface.h
CameraHardwareInterface.h defines the HAL interface; each platform provides its own implementation of it.
2. Android Camera uses a client/server (C/S) architecture: the client and the server live in two separate processes and talk to each other over Binder. This section walks through how the Camera completes its initialization, from device boot to entering the camera application.
First, since Camera communicates over Binder, its service has to be registered with the ServiceManager so that clients can look it up later. Where does that happen? frameworks/av/media/mediaserver/main_mediaserver.cpp contains a main() function that registers the media services, and that is where CameraService gets registered:
int main(int argc, char** argv)
{
    sp<ProcessState> proc(ProcessState::self());
    sp<IServiceManager> sm = defaultServiceManager();
    ALOGI("ServiceManager: %p", sm.get());
    AudioFlinger::instantiate();
    MediaPlayerService::instantiate();
    CameraService::instantiate();
    AudioPolicyService::instantiate();
    ProcessState::self()->startThreadPool();
    IPCThreadState::self()->joinThreadPool();
}
However, searching the CameraService sources turns up no instantiate() function. Where is it? Following the class hierarchy, it is inherited from the parent class BinderService:
template<typename SERVICE>
class BinderService
{
public:
    static status_t publish(bool allowIsolated = false) {
        sp<IServiceManager> sm(defaultServiceManager());
        return sm->addService(String16(SERVICE::getServiceName()), new SERVICE(), allowIsolated);
    }

    static void publishAndJoinThreadPool(bool allowIsolated = false) {
        sp<IServiceManager> sm(defaultServiceManager());
        sm->addService(String16(SERVICE::getServiceName()), new SERVICE(), allowIsolated);
        ProcessState::self()->startThreadPool();
        IPCThreadState::self()->joinThreadPool();
    }

    static void instantiate() { publish(); }

    static status_t shutdown() {
        return NO_ERROR;
    }
};
As you can see, publish() is where CameraService actually registers its service. What about the SERVICE type used here? The source explains it:
template<typename SERVICE>
SERVICE is a template parameter; since we are registering CameraService here, you can substitute CameraService for it:
sm->addService(String16(SERVICE::getServiceName()), new SERVICE(), allowIsolated);
With this, the Camera service is registered with the ServiceManager and is available to clients at any time.
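To make the template concrete, here is roughly what instantiate()/publish() expands to for SERVICE = CameraService. The expansion below is only an illustration and does not appear in the tree as such; CameraService::getServiceName() returns "media.camera", the same name the client later passes to getService().

// Illustrative expansion of BinderService<CameraService>::instantiate().
static void instantiate() {
    sp<IServiceManager> sm(defaultServiceManager());
    sm->addService(String16(CameraService::getServiceName()),  // "media.camera"
                   new CameraService(),
                   false /* allowIsolated */);
}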
The main() function of mediaserver is invoked via init.rc at startup, so the Camera service is registered as soon as the device boots and is ready for Binder communication:
service media /system/bin/mediaserver
    class main
    user media
    group audio camera inet net_bt net_bt_admin net_bw_acct drmrpc
    ioprio rt 4
With the Binder service registered, the next question is how the client connects to the server and opens the camera module.
Let's start from the source of the TestingCamera app. Its setUpCamera() method calls open(mCameraId), which drops into the framework layer and invokes the open() method of frameworks/base/core/java/android/hardware/Camera.java:
public static Camera open(int cameraId) {
    return new Camera(cameraId);
}
This invokes Camera's constructor. Let's look at the constructor:
Camera(int cameraId) {
    mShutterCallback = null;
    mRawImageCallback = null;
    mJpegCallback = null;
    mPreviewCallback = null;
    mPostviewCallback = null;
    mZoomListener = null;

    Looper looper;
    if ((looper = Looper.myLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else if ((looper = Looper.getMainLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else {
        mEventHandler = null;
    }

    native_setup(new WeakReference<Camera>(this), cameraId);
}
At last we arrive at JNI. Continue into the Camera JNI file, frameworks/base/core/jni/android_hardware_Camera.cpp:
// connect to camera service
static void android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,
    jobject weak_this, jint cameraId)
{
    sp<Camera> camera = Camera::connect(cameraId);

    if (camera == NULL) {
        jniThrowRuntimeException(env, "Fail to connect to camera service");
        return;
    }

    // make sure camera hardware is alive
    if (camera->getStatus() != NO_ERROR) {
        jniThrowRuntimeException(env, "Camera initialization failed");
        return;
    }

    jclass clazz = env->GetObjectClass(thiz);
    if (clazz == NULL) {
        jniThrowRuntimeException(env, "Can't find android/hardware/Camera");
        return;
    }

    // We use a weak reference so the Camera object can be garbage collected.
    // The reference is only used as a proxy for callbacks.
    sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);
    context->incStrong(thiz);
    camera->setListener(context);

    // save context in opaque field
    env->SetIntField(thiz, fields.context, (int)context.get());
}
Inside this JNI function we find the client end of the Camera C/S architecture: it calls connect() to send a connection request to the server. JNICameraContext is a listener class that handles the data and messages delivered by the lower-level Camera callbacks.
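For reference, JNICameraContext implements the native CameraListener interface declared in frameworks/av/include/camera/Camera.h. Below is a trimmed sketch of that interface (reconstructed from memory of the 4.2 sources, so treat the exact signatures as approximate): the native Camera object forwards callbacks arriving from CameraService to these methods, and JNICameraContext translates them into the Java-side callbacks through JNI.

#include <utils/RefBase.h>
#include <utils/Timers.h>       // nsecs_t
#include <binder/IMemory.h>
#include <system/camera.h>      // camera_frame_metadata_t

namespace android {

// Approximate shape of the listener interface JNICameraContext implements.
class CameraListener : virtual public RefBase {
public:
    // Asynchronous notifications (shutter, focus, error, zoom, ...).
    virtual void notify(int32_t msgType, int32_t ext1, int32_t ext2) = 0;
    // Preview / snapshot data callbacks.
    virtual void postData(int32_t msgType, const sp<IMemory>& dataPtr,
                          camera_frame_metadata_t *metadata) = 0;
    // Timestamped data, e.g. recording frames.
    virtual void postDataTimestamp(nsecs_t timestamp, int32_t msgType,
                                   const sp<IMemory>& dataPtr) = 0;
};

} // namespace android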
Let's see what the client-side connect() contains:
===>>> frameworks/av/camera/Camera.cpp
sp<Camera> Camera::connect(int cameraId)
{
    ALOGV("connect");
    sp<Camera> c = new Camera();
    const sp<ICameraService>& cs = getCameraService();
    if (cs != 0) {
        c->mCamera = cs->connect(c, cameraId);
    }
    if (c->mCamera != 0) {
        c->mCamera->asBinder()->linkToDeath(c);
        c->mStatus = NO_ERROR;
    } else {
        c.clear();
    }
    return c;
}
const sp<ICameraService>& cs = getCameraService(); obtains a handle to the Camera service through getCameraService():
// establish binder interface to camera service
const sp<ICameraService>& Camera::getCameraService()
{
    Mutex::Autolock _l(mLock);
    if (mCameraService.get() == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            binder = sm->getService(String16("media.camera"));
            if (binder != 0)
                break;
            ALOGW("CameraService not published, waiting...");
            usleep(500000); // 0.5 s
        } while (true);
        if (mDeathNotifier == NULL) {
            mDeathNotifier = new DeathNotifier();
        }
        binder->linkToDeath(mDeathNotifier);
        mCameraService = interface_cast<ICameraService>(binder);
    }
    ALOGE_IF(mCameraService == 0, "no CameraService!?");
    return mCameraService;
}
As you can see, the service handle is obtained over Binder; from the way Binder works, the remote object it refers to is the CameraService instance itself.
c->mCamera = cs->connect(c, cameraId);
Next, the server-side connect() is executed, and the ICamera object it returns is stored in Camera's mCamera. What the server's connect() actually returns is an instance of its inner class Client:
sp<ICamera> CameraService::connect(
        const sp<ICameraClient>& cameraClient, int cameraId) {
    int callingPid = getCallingPid();

    LOG1("CameraService::connect E (pid %d, id %d)", callingPid, cameraId);

    if (!mModule) {
        ALOGE("Camera HAL module not loaded");
        return NULL;
    }

    sp<Client> client;
    if (cameraId < 0 || cameraId >= mNumberOfCameras) {
        ALOGE("CameraService::connect X (pid %d) rejected (invalid cameraId %d).",
            callingPid, cameraId);
        return NULL;
    }

    char value[PROPERTY_VALUE_MAX];
    property_get("sys.secpolicy.camera.disabled", value, "0");
    if (strcmp(value, "1") == 0) {
        // Camera is disabled by DevicePolicyManager.
        ALOGI("Camera is disabled. connect X (pid %d) rejected", callingPid);
        return NULL;
    }

    Mutex::Autolock lock(mServiceLock);
    if (mClient[cameraId] != 0) {
        client = mClient[cameraId].promote();
        if (client != 0) {
            if (cameraClient->asBinder() == client->getCameraClient()->asBinder()) {
                LOG1("CameraService::connect X (pid %d) (the same client)",
                     callingPid);
                return client;
            } else {
                ALOGW("CameraService::connect X (pid %d) rejected (existing client).",
                      callingPid);
                return NULL;
            }
        }
        mClient[cameraId].clear();
    }

    if (mBusy[cameraId]) {
        ALOGW("CameraService::connect X (pid %d) rejected"
              " (camera %d is still busy).", callingPid, cameraId);
        return NULL;
    }

    struct camera_info info;
    if (mModule->get_camera_info(cameraId, &info) != OK) {
        ALOGE("Invalid camera id %d", cameraId);
        return NULL;
    }

    int deviceVersion;
    if (mModule->common.module_api_version == CAMERA_MODULE_API_VERSION_2_0) {
        deviceVersion = info.device_version;
    } else {
        deviceVersion = CAMERA_DEVICE_API_VERSION_1_0;
    }

    switch (deviceVersion) {
      case CAMERA_DEVICE_API_VERSION_1_0:
        client = new CameraClient(this, cameraClient, cameraId,
                info.facing, callingPid, getpid());
        break;
      case CAMERA_DEVICE_API_VERSION_2_0:
        client = new Camera2Client(this, cameraClient, cameraId,
                info.facing, callingPid, getpid());
        break;
      default:
        ALOGE("Unknown camera device HAL version: %d", deviceVersion);
        return NULL;
    }

    if (client->initialize(mModule) != OK) {
        return NULL;
    }

    cameraClient->asBinder()->linkToDeath(this);

    mClient[cameraId] = client;
    LOG1("CameraService::connect X (id %d, this pid is %d)", cameraId, getpid());
    return client;
}
In client->initialize(mModule), a CameraHardwareInterface object is created and stored in mHardware; mHardware->initialize() then drops into the HAL layer and opens the camera driver.
status_t CameraClient::initialize(camera_module_t *module) {
    int callingPid = getCallingPid();
    LOG1("CameraClient::initialize E (pid %d, id %d)", callingPid, mCameraId);

    char camera_device_name[10];
    status_t res;
    snprintf(camera_device_name, sizeof(camera_device_name), "%d", mCameraId);

    mHardware = new CameraHardwareInterface(camera_device_name);
    res = mHardware->initialize(&module->common);
    if (res != OK) {
        ALOGE("%s: Camera %d: unable to initialize device: %s (%d)",
                __FUNCTION__, mCameraId, strerror(-res), res);
        mHardware.clear();
        return NO_INIT;
    }

    mHardware->setCallbacks(notifyCallback,
            dataCallback,
            dataCallbackTimestamp,
            (void *)mCameraId);

    // Enable zoom, error, focus, and metadata messages by default
    enableMsgType(CAMERA_MSG_ERROR | CAMERA_MSG_ZOOM | CAMERA_MSG_FOCUS |
                  CAMERA_MSG_PREVIEW_METADATA | CAMERA_MSG_FOCUS_MOVE);

    LOG1("CameraClient::initialize X (pid %d, id %d)", callingPid, mCameraId);
    return OK;
}
CameraHardwareInterface::initialize() then calls into the HAL layer to open the camera driver:
status_t initialize(hw_module_t *module)
{
    ALOGI("Opening camera %s", mName.string());
    int rc = module->methods->open(module, mName.string(),
                                   (hw_device_t **)&mDevice); // this call opens the underlying camera driver
    if (rc != OK) {
        ALOGE("Could not open camera %s: %d", mName.string(), rc);
        return rc;
    }
    initHalPreviewWindow();
    return rc;
}
In mHardware->initialize(&module->common), the module comes from mModule, a camera_module_t structure. How is it initialized? CameraService has a function for that:
void CameraService::onFirstRef()
{
    BnCameraService::onFirstRef();

    if (hw_get_module(CAMERA_HARDWARE_MODULE_ID,
                (const hw_module_t **)&mModule) < 0) {
        ALOGE("Could not load camera HAL module");
        mNumberOfCameras = 0;
    }
    else {
        mNumberOfCameras = mModule->get_number_of_cameras();
        if (mNumberOfCameras > MAX_CAMERAS) {
            ALOGE("Number of cameras(%d) > MAX_CAMERAS(%d).",
                    mNumberOfCameras, MAX_CAMERAS);
            mNumberOfCameras = MAX_CAMERAS;
        }
        for (int i = 0; i < mNumberOfCameras; i++) {
            setCameraFree(i);
        }
    }
}
CameraService calls
hw_get_module(CAMERA_HARDWARE_MODULE_ID, (const hw_module_t **)&mModule)
to load the fake camera HAL module, camera.duck.so (the emulator build used here):
int hw_get_module(const char *id, const struct hw_module_t **module)
{
    return hw_get_module_by_class(id, NULL, module);
}

int hw_get_module_by_class(const char *class_id, const char *inst,
                           const struct hw_module_t **module)
{
    int status;
    int i;
    const struct hw_module_t *hmi = NULL;
    char prop[PATH_MAX];
    char path[PATH_MAX];
    char name[PATH_MAX];

    if (inst)
        snprintf(name, PATH_MAX, "%s.%s", class_id, inst);
    else
        strlcpy(name, class_id, PATH_MAX);

    /*
     * Here we rely on the fact that calling dlopen multiple times on
     * the same .so will simply increment a refcount (and not load
     * a new copy of the library).
     * We also assume that dlopen() is thread-safe.
     */

    /* Loop through the configuration variants looking for a module */
    for (i = 0; i < HAL_VARIANT_KEYS_COUNT + 1; i++) {
        if (i < HAL_VARIANT_KEYS_COUNT) {
            if (property_get(variant_keys[i], prop, NULL) == 0) {
                continue;
            }
            snprintf(path, sizeof(path), "%s/%s.%s.so",
                     HAL_LIBRARY_PATH2, name, prop);
            if (access(path, R_OK) == 0) break;

            snprintf(path, sizeof(path), "%s/%s.%s.so",
                     HAL_LIBRARY_PATH1, name, prop);
            if (access(path, R_OK) == 0) break;
        } else {
            snprintf(path, sizeof(path), "%s/%s.default.so",
                     HAL_LIBRARY_PATH1, name);
            if (access(path, R_OK) == 0) break;
        }
    }

    status = -ENOENT;
    if (i < HAL_VARIANT_KEYS_COUNT + 1) {
        /* load the module found above; if this fails we should not try
         * a different variant */
        status = load(class_id, path, module);
    }

    return status;
}
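To see what the probing loop above actually produces, the stand-alone sketch below walks the same candidate paths; the variant value "duck" (taken from ro.hardware) and the HAL_LIBRARY_PATH1/2 directory values are assumptions for illustration only.

#include <stdio.h>

// Mimics the path probing of hw_get_module_by_class() for class_id = "camera"
// with an assumed ro.hardware of "duck"; the real code access()-checks each
// candidate and dlopen()s the first readable one via load().
int main(void) {
    const char *name = "camera";            // class_id, no instance suffix
    const char *prop = "duck";              // assumed value of ro.hardware
    const char *dirs[] = { "/vendor/lib/hw" /* HAL_LIBRARY_PATH2, assumed */,
                           "/system/lib/hw" /* HAL_LIBRARY_PATH1, assumed */ };
    char path[256];

    for (unsigned i = 0; i < sizeof(dirs) / sizeof(dirs[0]); i++) {
        snprintf(path, sizeof(path), "%s/%s.%s.so", dirs[i], name, prop);
        printf("candidate: %s\n", path);    // ends with .../camera.duck.so
    }
    snprintf(path, sizeof(path), "/system/lib/hw/%s.default.so", name);
    printf("fallback:  %s\n", path);        // used when no variant key matches
    return 0;
}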
The code for the camera.duck.so module lives in development/tools/emulator/system/camera/:
====> development/tools/emulator/system/camera/Android.mk
LOCAL_SRC_FILES := \
    EmulatedCameraHal.cpp \
    EmulatedCameraFactory.cpp \
    EmulatedBaseCamera.cpp \
    EmulatedCamera.cpp \
    EmulatedCameraDevice.cpp \
    EmulatedQemuCamera.cpp \
    EmulatedQemuCameraDevice.cpp \
    EmulatedFakeCamera.cpp \
    EmulatedFakeCameraDevice.cpp \
    Converters.cpp \
    PreviewWindow.cpp \
    CallbackNotifier.cpp \
    QemuClient.cpp \
    JpegCompressor.cpp \
    EmulatedCamera2.cpp \
    EmulatedFakeCamera2.cpp \
    EmulatedQemuCamera2.cpp \
    fake-pipeline2/Scene.cpp \
    fake-pipeline2/Sensor.cpp \
    fake-pipeline2/JpegCompressor.cpp

ifeq ($(TARGET_PRODUCT),vbox_x86)
LOCAL_MODULE := camera.vbox_x86
else
LOCAL_MODULE := camera.duck
endif
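For context, a camera HAL library such as camera.duck.so exports a single well-known symbol, HAL_MODULE_INFO_SYM, which hw_get_module()'s load() step looks up with dlsym() after dlopen()ing the library; that struct is what ends up as CameraService's mModule. Below is a generic sketch of such an entry point (field values and the sketch_* names are illustrative assumptions, not the emulator's actual code, which builds its table around EmulatedCameraFactory in EmulatedCameraHal.cpp):

#include <errno.h>
#include <hardware/camera.h>           // camera_module_t, CAMERA_HARDWARE_MODULE_ID

// Minimal callback bodies so the sketch links; a real HAL does real work here.
static int sketch_open(const hw_module_t *module, const char *id,
                       hw_device_t **device) {
    // A real module would allocate a camera_device_t here (see earlier sketch).
    return -ENODEV;
}
static int sketch_get_number_of_cameras(void) { return 1; }
static int sketch_get_camera_info(int camera_id, struct camera_info *info) {
    info->facing = CAMERA_FACING_BACK;
    info->orientation = 0;
    return 0;
}

static hw_module_methods_t sketch_methods = {
    sketch_open,                       // .open
};

// hw_get_module() dlopen()s camera.<variant>.so and dlsym()s this symbol;
// the pointer it returns is what CameraService stores in mModule.
camera_module_t HAL_MODULE_INFO_SYM = {
    {                                  // .common (hw_module_t)
        HARDWARE_MODULE_TAG,           //   tag
        CAMERA_MODULE_API_VERSION_1_0, //   module_api_version
        HARDWARE_HAL_API_VERSION,      //   hal_api_version
        CAMERA_HARDWARE_MODULE_ID,     //   id ("camera")
        "Sketch camera HAL",           //   name
        "example",                     //   author
        &sketch_methods,               //   methods
        NULL,                          //   dso (filled in by the loader)
        {0},                           //   reserved
    },
    sketch_get_number_of_cameras,      // .get_number_of_cameras
    sketch_get_camera_info,            // .get_camera_info
};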
Anyone familiar with the HAL layer knows that hw_get_module() is how a HAL module stub is obtained. Here CAMERA_HARDWARE_MODULE_ID selects the Camera HAL stub, which is stored in mModule; from then on the service controls the camera module entirely through mModule. So when is onFirstRef() called?
onFirstRef() comes from the parent class RefBase: it is invoked the first time the object is wrapped in a strong pointer (sp), i.e. when its strong reference count goes from zero to one. For CameraService this happens in the mediaserver process, when publish() hands new CameraService() to addService() inside an sp at boot time, so the camera HAL module is loaded as soon as the service is registered. When the client later initiates a connection,
sp<Camera> Camera::connect(int cameraId)
{
    LOGV("connect");
    sp<Camera> c = new Camera();
    const sp<ICameraService>& cs = getCameraService();
}
the getCameraService() call here only fetches a Binder proxy (an ICameraService handle) to that already-constructed CameraService; the work done in onFirstRef(), loading the HAL module and counting the cameras, has long since been completed on the service side.
Once CameraService::connect() returns the Client, the connection between the client and the server is established. Camera initialization is complete, and preview, capture and the other operations can proceed.