
Kim Loza
Kim is an XR Software Engineer and a founder of Perditio Inc., which creates immersive experiences for social good. Perditio, founded in San Francisco, now has branches in Africa and teams across Latin America & Europe who share a passion for promoting DEI (diversity, equity & inclusion) through music and immersive experiences.
Kim previously created immersive media experiences on Intel Sports' Immersive Media Processing & Experiences Team. She has experience shipping 100+ consumer-facing AR/VR/MR applications to app stores. She is an active community board member for Latino Tech groups that encourage and support underrepresented minorities in pursuing opportunities in tech. Her technical interests include immersive media, artificial intelligence, and contributing to open-source initiatives. She holds a B.S. in Computer Engineering from San Francisco State University and has 6+ pending patents on methods for implementing point-cloud content.
Technology feeds my mind, but music feeds my soul.
XR Software Engineer | Experience Curator | Perditio Inc. Cofounder
Career Before Perditio

March 2017 - December 2021
Kim spent 4+ years of her career as an XR Software Engineer creating immersive experiences within Intel's Intel Capital division, a force multiplier that helps early-stage startups build the companies of tomorrow.
During her time at Intel she had the opportunity to work on Project Alloy, Intel's all-in-one HMD. She also helped ship 100+ live XR sports consumer apps for clients such as the Olympics, TNT, the PGA, the NFL, MLB, LaLiga, and many more. Kim also served as a technical expert in the XR space, contributing to OpenXR and to MPEG-I efforts for emerging immersive media standards, publishing 3+ technical white papers on delivering object-based immersive media experiences in sports, and filing 6+ patents on methods for implementing point-cloud content.
Intel True View 360° Highlights: Minnesota Vikings vs. Green Bay Packers
March Madness: One of UVA's Biggest Plays in Intel's True View
Intel® True View (formerly freeD) Technology + NFL | Intel
Intel® True VR Brings You To PyeongChang | Intel
LALIGA INTEL® TRUE VIEW

Intel Sports Immersive Media Platform
Intel® True VR and Intel® True View share a common video processing and distribution pipeline that is hosted in the cloud.
The video processing component of the pipeline takes the uncompressed virtual camera videos from the Intel® True View capture and the panoramic video from the Intel® True VR capture, then encodes those video sources.
The settings for stream types, codecs, compression quality, picture resolutions, and bitrates depend heavily on the targeted end-user experiences and the expected network conditions. The system is flexible enough to support current industry-standard codecs such as AVC, HEVC, and M-JPEG, and to adopt upcoming immersive media standards, in order to maintain support for a wide range of platforms and devices.
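As a rough illustration only, per-experience encoding profiles and bandwidth-driven profile selection could be sketched as below; the profile names, resolutions, and bitrates are invented for this sketch and are not Intel's actual settings.

from dataclasses import dataclass

@dataclass
class StreamProfile:
    """Hypothetical encoding profile: codec, resolution, and bitrate are
    chosen per target experience and expected network conditions."""
    codec: str        # e.g. "AVC", "HEVC", "M-JPEG"
    width: int
    height: int
    bitrate_kbps: int

# Illustrative values only; real settings depend on the device and network.
PROFILES = {
    "mobile_low":   StreamProfile("AVC",  1280,  720,  3_000),
    "mobile_high":  StreamProfile("HEVC", 1920, 1080,  8_000),
    "hmd_panorama": StreamProfile("HEVC", 3840, 2160, 20_000),
}

def pick_profile(measured_kbps: int) -> StreamProfile:
    """Pick the richest profile that fits the measured bandwidth."""
    fitting = [p for p in PROFILES.values() if p.bitrate_kbps <= measured_kbps]
    return max(fitting, key=lambda p: p.bitrate_kbps) if fitting else PROFILES["mobile_low"]

print(pick_profile(10_000))  # selects the 1080p HEVC profile in this sketch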
The stream packager takes the encoded videos and converts the in-memory bits into consumable bitstreams. A content distribution network (CDN) then enables the distribution component of the pipeline to stream the content to client players.
On the client side, the stream de-packager reads the consumable bitstreams and feeds them to the decoder. The client player's video decoder then decompresses the video bitstream. Finally, the video renderer takes the decoded video sequences and renders them, based on user preference, into what is seen on the device.
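To make the client-side path concrete, here is a minimal, hypothetical sketch of those three stages (de-packager, decoder, renderer); the function names and placeholder types are illustrative and do not reflect Intel's actual player code.

from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    """Placeholder for a decoded video frame."""
    index: int
    pixels: bytes

def depackage(segment: bytes) -> bytes:
    """Unwrap a streamed segment into a raw encoded bitstream."""
    return segment  # a real player would parse the container format here

def decode(encoded: bytes) -> List[Frame]:
    """Decompress the bitstream into a sequence of frames."""
    return [Frame(0, encoded)]  # a real player hands this to an AVC/HEVC decoder

def render(frames: List[Frame], viewport: str) -> None:
    """Draw the decoded frames for the viewport the user has chosen."""
    for frame in frames:
        print(f"rendering frame {frame.index} for viewport {viewport!r}")

def play_segment(segment: bytes, viewport: str) -> None:
    render(decode(depackage(segment)), viewport)

play_segment(b"\x00\x01", viewport="midfield")

Keeping the stages separate mirrors the pipeline description above: each component can be swapped out independently, for example a different decoder per codec, without touching the rest of the player.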
ITU Journal: ICT Discoveries Publication
Published in ITU Journal: ICT Discoveries (Volume 3, Issue 1)
Publication date: 18 May 2020
Media and sports
Title
Delivering object-based immersive media experiences in sports
Abstract
Immersive media technology in sports enables fans to experience interactive, personalized content. A fan can experience the action in six degrees of freedom (6DoF), through the eyes of a player or from any desired perspective. Intel Sports makes deploying these immersive media experiences a reality by transforming the captured content from the cameras installed in the stadium into preferred volumetric formats used for compressing and streaming the video feed, as well as for decoding and rendering desired viewports to fans’ many devices. Object-based immersive coding enables innovative use cases where the streaming bandwidth can be better allocated to the objects of interest. The Moving Picture Experts Group (MPEG) is developing immersive codecs for streaming immersive video and point clouds. In this paper, we explain how to implement object-based coding in MPEG metadata for immersive video (MIV) and video-based point-cloud coding (V-PCC) along with the enabled experiences.
Keywords
Immersive media, MPEG-I, point-cloud, six degrees of freedom (6DoF), volumetric video
Authors
Fai Yeung | Basel Salahieh | Kimberly Loza | Sankar Jayaram | Jill Boyce
VR Industry Forum Guidelines Publication
Intel Sports: Immersive Media Experiences in Sports Guiding Example Use Case (sec. D.2, p. 124)
VR Industry Forum Guidelines 2.1 · Feb 5, 2020
This version of the VR Industry Forum (VRIF) Guidelines builds upon version 2.0, for which Intel Sports submitted a white paper in response to the VRIF working group's call for proposals for industry examples of producing and distributing live VR services. The included section from the Intel Sports white paper can be found in sec. D.2, p. 124 of the guide.
2016 Intel announces untethered VR with Project Alloy (CNET News)
Career Before Intel
What is OSVR? Open Source Virtual Reality
Other Immersive Experiences

Delectably Dark AR Filter
Try out the album cover AR filter for Perditio's debut album, Delectably Dark. The filter is sound reactive.
Follow @perditioinc
Made with Spark AR

🖤🥀🖤 AR Filter
The Pop and Reveal 🖤🥀🖤 #DarkGroove AR Filter is available on Instagram and Facebook.
Follow @perditioinc
Made with Spark AR
Interactive 3D Models

Delectably Dark Album Cover Artwork

GLB Astronaut & USDZ iOS Astronaut
glTF Dinosaur


2018 ⏰ 48HRS
GearVR Simulator Game
Created a VR simulator capstone project for Udacity's Virtual Reality Mobile Performance & 360 Media Specialization Nanodegree Program.
StoryLine: (Inspired By True Events - Parody)
The clock is ticking: there are only 48 hours until the end of the semester, and everything is riding on completing the ‘BIG FINAL PROJECT.’ Everything you’ve worked for. Everything you’ve learned. It’s showtime! Not completing the project would mean failing the class. But there is one lingering question: what do you make?
Game Type:
An Interactive Simulator, Sandbox, Open World


2015 🐬 ORCA
Underwater ROV Prototype
We wanted to create a commercially affordable remotely operated underwater vehicle (ROV) with features commonly reserved for industrial underwater drones.
This ROV is customizable: it can serve as an inspection tool, be used for research purposes, or even support small marine businesses.
The possible applications are endless.
CAPABILITIES
- Full range of motion
- 30 ft range
- Camera with live stream and video capture
- Over 2 hours of battery life
- Can be fully disassembled
- Customizable application caps
- Sensors include: accelerometer, gyroscope, depth, and temperature