Add spatial tracking to 3D objects and get results
Tutorial showing attention data insights for AR/VR objects
Coffee, like spatial computing, is a passion shared by most XR developers, so let’s take three coffee machines as example 3D objects and apply spatial tracking to deliver attention insights. The objects could be product demos (retail, automotive etc), virtual try-ons (fashion, furniture etc), digital twins (manufacturing, construction etc) or training, advertising and game experiences. For these coffee machines, we want to know which one performs best, how we can optimise the user experience and which actions will drive business objectives. Spatial tracking, and the AR/VR attention data it generates, can be applied to any 3D object, whether that’s an individual AR model or multiple objects in a VR scene. So, now that we have your attention, let’s jump in, show you how it works and walk through the type of attention data insights you get.
Add CORTEXR spatial tracking plug-in to 3D engine
Our plug-in for 3D engines (Unity, A-Frame etc) adds spatial tracking to 3D objects without needing code or data analytics experience. Setting up spatial tracking and configuring 3D objects takes less than 10 minutes for developers with intermediate experience of their preferred XR platform. The plug-in places a cube around the 3D object, scaled in this case to the dimensions of the coffee machine, with each face of the cube generating attention data based on where users looked. Multiple 3D objects can be tracked in a scene (e.g. a kitchen environment with various objects), but today we’re focused on individual 3D models and the insights our XR data analytics platform generates. Attention data outputs are similar to eye tracking but far more scalable, and spatial tracking reveals deeper insights about spatial behaviour in spatial environments than standard web analytics tools such as Google Analytics and Unity Analytics.
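To make the per-face mechanism concrete, here’s a minimal sketch in TypeScript of the idea, not the plug-in’s actual code (you don’t write any): a bounding cube is fitted around the model and each gaze sample is attributed to whichever face of that cube the user is viewing. The type and function names are illustrative assumptions, not CORTEXR’s API.

```typescript
// Illustrative sketch only: names and structure are assumptions, not the CORTEXR plug-in API.
type Face = 'front' | 'back' | 'left' | 'right' | 'top' | 'bottom';

interface Vec3 { x: number; y: number; z: number; }

// Pick the cube face the user is viewing from the direction pointing from the
// object's centre towards the camera. The dominant axis decides the face;
// the axis-to-face mapping depends on how the model is oriented in the scene.
function visibleFace(toCamera: Vec3): Face {
  const ax = Math.abs(toCamera.x), ay = Math.abs(toCamera.y), az = Math.abs(toCamera.z);
  if (ax >= ay && ax >= az) return toCamera.x > 0 ? 'right' : 'left';
  if (ay >= ax && ay >= az) return toCamera.y > 0 ? 'top' : 'bottom';
  return toCamera.z > 0 ? 'front' : 'back';
}

// Accumulate per-face attention time from gaze samples taken at a fixed interval.
function accumulateAttention(samples: Vec3[], sampleSeconds: number): Record<Face, number> {
  const totals: Record<Face, number> = { front: 0, back: 0, left: 0, right: 0, top: 0, bottom: 0 };
  for (const sample of samples) {
    totals[visibleFace(sample)] += sampleSeconds;
  }
  return totals;
}
```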
Identify best performing object with highest attention levels
Almost 20% more attention is given to the red coffee machine, so people found this 3D object more appealing: it held their attention longer than the yellow and white models. Attention Total is the amount of attention users give to the 3D model, with data split by each face visible to users (front, right, back, left, top); we’ve also added the brand logo as it’s important to this project’s objectives. Attention Distribution shows the percentage split, so you can analyse attention for each face and how this differs between 3D objects. This extra insight layer of spatial tracking on 3D objects, on top of standard data analytics (dwell time, interactions etc), gives you a complete picture of user behaviour to improve the performance of your AR/VR project, whether that’s pre-testing, prototyping, A/B testing, live monitoring or post analysis.
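As a rough sketch of how the two metrics relate (the field names here are illustrative, not CORTEXR’s export schema), Attention Distribution is simply each face’s percentage share of the object’s Attention Total:

```typescript
// Illustrative only: keys and units are assumptions, not CORTEXR's export schema.
type FaceTotals = Record<string, number>;   // Attention Total per face (plus tagged elements like the logo)

// Attention Distribution: each face's share of the object's total attention, in percent.
function attentionDistribution(totals: FaceTotals): Record<string, number> {
  const sum = Object.values(totals).reduce((a, b) => a + b, 0);
  const distribution: Record<string, number> = {};
  for (const [face, value] of Object.entries(totals)) {
    distribution[face] = sum > 0 ? (100 * value) / sum : 0;
  }
  return distribution;
}
```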
Insights and actions:
Identify red as the winning object: placing it early in the experience will increase overall user engagement and e-commerce ‘buy now’ funnel metrics.
Optimise attention across all sides of 3D objects to improve interactions with areas (like the right side) that get attention but have no user actions or events.
Analyse successful elements and dial up those features, e.g. increase performance of the red model by adding coffee cups on top, as this works well on the yellow model.
Analyse user behaviour around 3D objects with attention heatmaps
Over 70% of user attention (on average) is focused on the front face of the 3D objects, as people spend time understanding the functional controls and ergonomic features of the coffee machines. Our Attention Areas heatmap shows spatial tracking on 3D objects as an unboxed cube, so you can examine attention on each face of the 3D model and, importantly, whether user attention drifts outside the 3D object. Heatmaps in our prebuilt dashboards are more than pretty pictures: hovering over each cell shows its data values, letting you dig into specific elements, export data and analyse the pattern of user attention and its impact on project objectives.
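One way to picture the data behind the heatmap (the cell structure and CSV layout below are assumptions for illustration, not the dashboard’s export format): each face is a grid of cells holding attention values, which you can query for hotspots or export for analysis elsewhere.

```typescript
// Illustrative heatmap shape only; the cell grid and CSV layout are assumptions.
interface HeatmapCell { face: string; row: number; col: number; attention: number; }

// Return cells whose share of the object's total attention exceeds a threshold (e.g. 0.05 for 5%).
function hotspots(cells: HeatmapCell[], minShare: number): HeatmapCell[] {
  const total = cells.reduce((sum, cell) => sum + cell.attention, 0);
  return cells.filter((cell) => total > 0 && cell.attention / total >= minShare);
}

// Flatten the grid to CSV rows for analysis outside the dashboard.
function toCsv(cells: HeatmapCell[]): string {
  const header = 'face,row,col,attention';
  const rows = cells.map((cell) => `${cell.face},${cell.row},${cell.col},${cell.attention}`);
  return [header, ...rows].join('\n');
}
```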
Insights and actions:
Identify high visual attention areas on each face of the object (e.g. functional controls on front) to ensure UX design and UI interactions meet objectives.
Optimise low-attention areas with nascent appeal (e.g. the coffee cups and water tank) to drive exploration around 3D objects and increase dwell time and engagement.
Analyse the hierarchy of attention against commercial goals (e.g. the brand logo receives only 5% of attention) to improve brand recall, call-to-action buttons etc.
Optimise order of events with Attention Priority maps
User attention is mainly on the front of these objects, as users see the front first and aren’t strongly motivated to explore every side of the object. This could be an issue if important information is being missed on the right/back/left/top faces, but it’s a successful result for this project, as aesthetics on a kitchen counter and the ergonomics of functional elements are the main business goals. Attention Priority sequence maps show the order of user attention around 3D objects to reveal additional insights (white is the first view, purple the last). Here we dig into the results on the front face to understand what’s going on. The first view is often the most important, but the last view is sometimes more interesting, as it adds another layer of diagnosis to improve project performance.
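To show how a priority sequence can be derived (the per-cell first-view timestamps below are an assumed input; the dashboard computes this ordering for you), cells are simply ranked by when they were first viewed:

```typescript
// Illustrative only: the input shape is an assumption, not CORTEXR's data model.
interface ViewedCell { face: string; row: number; col: number; firstViewedAt: number; }

// Rank cells by first view: rank 1 corresponds to the lightest colour (first view)
// and the highest rank to the darkest (last view) on the priority map.
function priorityOrder(cells: ViewedCell[]): (ViewedCell & { rank: number })[] {
  return [...cells]
    .sort((a, b) => a.firstViewedAt - b.firstViewedAt)
    .map((cell, index) => ({ ...cell, rank: index + 1 }));
}
```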
Insights and actions:
Analyse intuitive user behaviour against the intended content experience (e.g. the control panel ranks first and ergonomics is the primary objective, so this is a successful result).
Optimise narrative and user flow around objects (e.g. users moved around to the right because the nozzle’s position on the right led their attention in that direction).
Identify unexpected orders of attention (e.g. the logo gets attention early in the experience despite its low overall attention, suggesting brand validation matters to users).
The metrics of human behaviour with spatial data analytics
Human behaviour with 3D spatial content doesn’t always follow the UX/UI design once an AR/VR experience is released into the wild. User behaviour in 6DOF spatial environments can be unpredictable and surprising, so spatial tracking on 3D objects helps you build, test, monitor, analyse and grow your XR projects. Are people viewing the areas you expect them to, which areas do they look at first (versus last), and does this support your business objectives? The CORTEXR data analytics platform gives you the answers so that, combined with standard data analytics (Google, Unity etc), you have the full picture to grow your business.