This week we had to present our marker-based AR application. The purpose of the application was to highlight a European country on the map based on the flag detected by the camera. The presentation went well, but some of the flags did not work as well as others: Vuforia had issues detecting some of them due to contrast and lighting.
Later, during the course session, we were introduced to markerless AR applications. The main difference between marker-based and markerless AR applications is that a markerless application does not need a marker (a QR code or a custom image) in order to run properly. In addition, the teacher gave us examples of common and advanced AR platforms, such as phones and several types of AR headsets. Their uses range from games to shows and even informational material.
Real life examples:
- News: to make the news more interactive and descriptive.
- Modeling pipeline: to improve a design and check how it will look in real life.
- Entertainment: games or shows.
During this week we also got the second assignment, which builds on what I have described so far. We are still in the planning phase regarding the idea and scope, but it will likely be something along the lines of a museum visit or a hospital-themed educational experience about the human body.
In order to develop the second assignment we will mainly use ARCore, which is built on three fundamental concepts: motion tracking, environmental understanding and light estimation.
Motion tracking takes care of combining the real world with the virtual one by keeping the virtual camera aligned with the device's real camera. This is possible thanks to a process called SLAM (simultaneous localization and mapping), which uses feature points to calculate the device's position and any change in it. Visual information captured by the camera is combined with the device's inertial measurements to estimate the pose of the camera relative to the world.
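To get a feel for the visual-plus-inertial idea, here is a deliberately tiny 1-D sketch of sensor fusion. It is not how ARCore's SLAM actually works internally (that involves feature matching and much more sophisticated filtering); the blending weight `ALPHA` and the sensor readings are invented for illustration.

```python
ALPHA = 0.8  # how much we trust the inertial prediction vs. the visual fix

def fuse(previous_pose, inertial_delta, visual_pose):
    """Blend an inertial dead-reckoning prediction with a visual measurement."""
    predicted = previous_pose + inertial_delta
    return ALPHA * predicted + (1 - ALPHA) * visual_pose

pose = 0.0
# (inertial_delta, visual_pose) pairs from hypothetical sensor readings
for delta, seen in [(0.10, 0.12), (0.10, 0.21), (0.10, 0.33)]:
    pose = fuse(pose, delta, seen)
    print(round(pose, 3))
```

The inertial data keeps the estimate smooth between frames, while the visual measurements stop the accumulated drift from growing unbounded, which mirrors the role each sensor plays in the real system.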
Environmental understanding improves the detection of feature points on planes (common horizontal or vertical surfaces, like tables and walls). ARCore also makes the boundaries of these planes available to the app, which is useful for rendering virtual objects resting on them.
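One concrete use of those plane boundaries is deciding whether a point on the plane actually falls inside the detected surface, for example before placing a virtual object on a table. As a hypothetical sketch (the polygon here is made up, and ARCore's own API exposes this differently), a standard ray-casting point-in-polygon test does the job:

```python
def inside_boundary(point, polygon):
    """Return True if `point` (x, z) lies inside `polygon`,
    a list of (x, z) vertices describing a plane's boundary."""
    x, z = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, z1 = polygon[i]
        x2, z2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray cast to the right of the point
        if (z1 > z) != (z2 > z):
            x_cross = x1 + (z - z1) * (x2 - x1) / (z2 - z1)
            if x < x_cross:
                inside = not inside
    return inside

table = [(0, 0), (1, 0), (1, 1), (0, 1)]       # a 1 m square table top
print(inside_boundary((0.5, 0.5), table))      # point on the table: True
print(inside_boundary((2.0, 0.5), table))      # point off the table: False
```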
Light estimation ensures that virtual objects blend into the real-world environment as naturally as possible, by adjusting contrast and providing color correction based on the camera image.
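The color-correction part can be pictured as multiplying a virtual object's base color by per-channel factors estimated from the camera image. This toy example is only an illustration of the idea; the correction values are invented, not something ARCore reported:

```python
def apply_color_correction(albedo, correction):
    """Multiply an RGB color by per-channel correction factors, clamped to [0, 1]."""
    return tuple(min(1.0, max(0.0, a * c)) for a, c in zip(albedo, correction))

white = (1.0, 1.0, 1.0)
warm_indoor = (1.0, 0.9, 0.7)  # hypothetical estimate under warm indoor light
print(apply_color_correction(white, warm_indoor))
```

A pure-white virtual object rendered under this estimate comes out slightly warm-tinted, so it no longer looks pasted on top of the camera feed.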
One last thing: we will use ARFoundation to implement all of these concepts and ideas in a functional application.
Author: Roksana Dziadowicz