AR Foundation Series Tutorial - 08: Remote Debugging
Both the ARCore XR Plugin for Unity and the ARKit XR Plugin for Unity provide remote debugging features that make it convenient to debug code. AR Foundation's own remote debugging feature is still under development, so I purchased a third-party tool from the Unity Asset Store called AR Foundation Editor Remote. It lets you debug on an AR-capable mobile device over Wi-Fi or a wired connection.
(A download link for AR Foundation Editor Remote is provided at the end of this tutorial.)
一、Official Introduction
⌛ Fast iterations are crucial for development. Unity Editor does not currently support AR mocking and testing, so you're required to make a new build after any minor change. And builds take a looooong time even for small projects.
Now you have the solution!
AR Foundation Editor Remote is an Editor extension that allows you to transmit AR data from AR device to Unity Editor. Test AR projects right in the Editor!
⚡ Features ⚡
• Precisely replicates the behavior of a real AR device in Editor.
• Supports all AR Foundation platforms. Extensively tested with ARKit and ARCore.
• Plug-and-play: no additional scene setup is needed, just run your AR scene in Editor with AR Companion running on your AR device. Extensively tested with scenes from the AR Foundation Samples repository.
• Multi-touch input remoting: test multi-touch input in Editor or simulate touch with mouse (see Limitations).
• Written in pure C# with no third party libraries or native code. Adds no performance overhead in production. Full source code is available.
• Connect any AR Device to Windows PC or macOS via Wi-Fi: iOS + Windows PC, Android + macOS... any variation you can imagine!
• Supports wired connection on iOS + macOS.
⚡ Supported AR subsystems ⚡
• Meshing (ARMeshManager): physical environment mesh generation, ARKit mesh classification support.
• Occlusion (AROcclusionManager): ARKit depth/stencil human segmentation, ARKit/ARCore environment occlusion (see Limitations).
• Face Tracking: face mesh, face pose, eye tracking, ARKit Blendshapes.
• Body Tracking: ARKit 2D/3D body tracking, scale estimation.
• Plane Tracking: horizontal and vertical plane detection, boundary vertices, raycast support.
• Image Tracking: supports mutable image library and replacement of image library at runtime (see Limitations on Android).
• Depth Tracking (ARPointCloudManager): feature points, raycast support.
• Camera: camera background video (see Limitations), camera position and rotation, facing direction, camera configurations.
• Anchors (ARAnchorsManager): add/remove anchors, attach anchors to detected planes.
• Session subsystem: Pause/Resume, receive Tracking State, set Tracking Mode (see Limitations).
• Light Estimation: Average Light Intensity, Brightness, and Color Temperature; Main Light Direction, Color, and Intensity; Exposure Duration and Offset; Ambient Spherical Harmonics.
• Raycast subsystem: raycast against detected planes and cloud points (see Limitations).
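To illustrate how one of the subsystems above is consumed in code, here is a minimal, hypothetical sketch of reading Light Estimation values each frame. It uses the standard AR Foundation `ARCameraManager.frameReceived` event (this is generic AR Foundation usage, not an API specific to this plugin; the class and field names `LightEstimator`, `cameraManager`, and `sceneLight` are illustrative):

```csharp
// Sketch: applying light estimation data to a scene light each camera frame.
// Assumes Light Estimation is enabled on the ARCameraManager component.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(Light))]
public class LightEstimator : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;
    Light sceneLight;

    void Awake() => sceneLight = GetComponent<Light>();
    void OnEnable() => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Each value is nullable; it is only present when the platform provides it.
        if (args.lightEstimation.averageBrightness.HasValue)
            sceneLight.intensity = args.lightEstimation.averageBrightness.Value;

        if (args.lightEstimation.averageColorTemperature.HasValue)
            sceneLight.colorTemperature = args.lightEstimation.averageColorTemperature.Value;

        if (args.lightEstimation.mainLightDirection.HasValue)
            sceneLight.transform.rotation =
                Quaternion.LookRotation(args.lightEstimation.mainLightDirection.Value);
    }
}
```

With the remote plugin, the values delivered to `frameReceived` in the Editor are streamed from the device, so the same script works unmodified in both environments.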
💡 Requirements 💡
• Unity >= 2019.2 (beta versions of Unity are not supported).
• AR Device (iPhone with ARKit support, Android with ARCore support, etc.).
• AR Device and Unity Editor should be on the same Wi-Fi network (wired connection is supported on iOS + macOS).
• Stable version of AR Foundation >= 3.0.1.
💡 Plugin workflow 💡
1. Please read the Documentation located at Assets/Plugins/ARFoundationRemoteInstaller/DOCUMENTATION.txt.
2. Build and run ARCompanion app on your AR device.
3. Enter ARCompanion app IP in plugin's settings.
4. Test and debug your AR project in Editor without the need to make a new build after every change (you will not see your recent changes in the ARCompanion app; ARCompanion is only used to send AR data and touches back to the Editor).
5. Always test your project on a real AR device before releasing it to production.
👉 Limitations 👈
• Please check that your AR device supports the AR feature you want to test in Editor. For example, to test Meshing in Editor, your AR device should support Meshing.
• Camera background video and occlusion:
- Default resolution scale is 0.33. You can increase the resolution in plugin's Settings, but this will result in higher latency and lower frames-per-second.
- Windows Editor 2019.2: video and occlusion are not supported.
- CPU images (TryAcquireLatestCpuImage, TryAcquireEnvironmentDepthCpuImage, etc.) are not supported. As an alternative, you can use Graphics.Blit() to copy textures and access them on CPU (see Texture2DSerializable.cs for example).
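The Graphics.Blit() workaround mentioned above can be sketched roughly as follows. This is a generic Unity GPU-to-CPU readback pattern, not code from the plugin itself; the class and method names (`TextureReadback`, `CopyToCpu`) are illustrative:

```csharp
// Sketch: since CPU image acquisition is not supported by the remote plugin,
// copy the camera background texture into a temporary RenderTexture with
// Graphics.Blit(), then read the pixels back on the CPU.
using UnityEngine;

public static class TextureReadback
{
    public static Texture2D CopyToCpu(Texture gpuTexture)
    {
        var rt = RenderTexture.GetTemporary(gpuTexture.width, gpuTexture.height, 0);
        Graphics.Blit(gpuTexture, rt);   // GPU-side copy into a readable target

        var previous = RenderTexture.active;
        RenderTexture.active = rt;

        // ReadPixels copies from the active RenderTexture into CPU memory.
        var result = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);
        result.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
        result.Apply();

        RenderTexture.active = previous;
        RenderTexture.ReleaseTemporary(rt);
        return result;
    }
}
```

Note that `ReadPixels` stalls the render pipeline, so this is acceptable for debugging in the Editor but should not run every frame in production.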
• Raycast subsystem: ARRaycastManager is implemented on top of ARPlaneManager.Raycast() and ARPointCloudManager.Raycast(). Please add ARPlaneManager to your scene to raycast against detected planes and ARPointCloudManager to raycast against detected cloud points.
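A minimal sketch of the setup this limitation requires: a scene containing `ARPlaneManager` and `ARPointCloudManager` alongside `ARRaycastManager`, with a script that raycasts from the screen center. This is standard AR Foundation usage (the class name `CenterRaycaster` is illustrative):

```csharp
// Sketch: raycasting against planes and feature points with ARRaycastManager.
// Per the limitation above, ARPlaneManager and ARPointCloudManager must both
// be present in the scene for their respective trackable types to be hit.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CenterRaycaster : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        var screenCenter = new Vector2(Screen.width / 2f, Screen.height / 2f);

        // Planes are hit only if ARPlaneManager is in the scene;
        // feature points only if ARPointCloudManager is in the scene.
        if (raycastManager.Raycast(screenCenter, hits,
                TrackableType.PlaneWithinPolygon | TrackableType.FeaturePoint))
        {
            Debug.Log($"Hit {hits[0].hitType} at {hits[0].pose.position}");
        }
    }
}
```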
• Touch input remoting and simulation in Unity 2019.2:
- Please add this line on top of every script that uses UnityEngine.Input:
using Input = ARFoundationRemote.Input;
- UI system will not respond to touch events. Please use your mouse to test your UI in Editor.
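Assuming the plugin's `ARFoundationRemote.Input` class mirrors the `UnityEngine.Input` touch API (which the alias instruction above implies), a script using remoted touches might look like this hypothetical sketch (the class name `TouchLogger` is illustrative):

```csharp
// Sketch: with the alias below, Input calls in this script are routed through
// the plugin's remoted touch input when running in the Editor (Unity 2019.2).
using UnityEngine;
using Input = ARFoundationRemote.Input;

public class TouchLogger : MonoBehaviour
{
    void Update()
    {
        // Touches reported here arrive from the AR device via the companion app.
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);
            Debug.Log($"Touch {touch.fingerId}: {touch.phase} at {touch.position}");
        }
    }
}
```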
• Image tracking on Android:
- The plugin sends reference images from Editor and adds them on the AR device at runtime with ScheduleAddImageJob(). While everything works fine on iOS, image tracking on Android works with delays (only applicable to AR Foundation 4 or newer). It seems to be a bug in ARCore and, hopefully, it will be fixed in the future.
• Session subsystem:
- ARSession.subsystem.configurationChooser is not supported.
Recommended Learning Resources
2. Unity official API documentation: when learning a technology, the official documentation is the most authoritative resource.
3. AR Foundation Samples: the official AR Foundation sample project repository.
4. AR Foundation Editor Remote download link: https://pan.baidu.com/s/1LFVtvdDCUeZJlebgZeqSyQ Extraction code: ***** (available on CSDN)
⚠️ These resources are for personal learning purposes only; all commercial use is strictly prohibited.
If you are interested in AR technology, you are welcome to join QQ group 883655607 for discussion.