Presented by NYC Media Lab. Sponsored by MLBAM.
New York City’s largest born-and-bred tech startup, MLBAM is a full-service solutions provider that has delivered world-class digital experiences for more than 16 years, distributing content through all forms of interactive media. Its digital leadership and capabilities stem from designing dynamic functionality for web, mobile applications, and connected devices while integrating live and on-demand multimedia, providing valuable products for millions of fans around the globe. MLBAM also develops, deploys, and distributes the highest-grossing sports app, At Bat, and manages live video content for dozens of sports, news, and entertainment clients through its technology subsidiary, BAMTech. It captures, encodes, and distributes tens of thousands of live video events annually, powering more live events on the Internet than any other property in the world.
About the Hackathon
In Fall 2016, NYC Media Lab presented a lineup of meetups at which MLBAM shared its current experiments in data visualization, player tracking, big data analysis for recommender systems and fraud detection, and virtual reality. To build on the buzz from the meetups, NYC Media Lab presents the MLBAM Hackathon.
The hackathon will gather teams to build hybrid multimedia experiences and applications that explore new opportunities for sports content consumption, interaction, and engagement. Areas for exploration include: VR, AR, predictive algorithms, machine learning, data visualization storytelling, projection mapping, livestream video, motion capture, 3D modeling, game design, animation, wearable technologies, and body-as-platform experiences.
NYU MAGNET, 2 MetroTech, 8th Floor, Brooklyn NY 11201
Friday, February 10th (6pm - 9pm) First Pitch
Saturday, February 11th (9am - 9pm) Hacking
Sunday, February 12th (9am - 2pm) Presentations
Hackathon teams will choose one track. Each track has specific challenge questions that teams are expected to consider when submitting an application describing their prototype idea:
Data visualization for big data track
1. Consider various data, such as transactional data from ticket sales, ballpark attendance data, At Bat app accesses, MLB.com web traffic data, and social media data. How can we visualize MLB user/fan engagement levels during a specified Opening Day?
2. How can we use various data sources to tell stories about baseball fans and how they engage across various platforms?
Virtual reality, augmented reality, mixed reality, 360 video track
1. How can you maintain a narrative with a camera that records 360 degrees?
Traditional cameras don’t capture a 360-degree perspective, so the narrative is easy to follow. When the user has 360 degrees of view in a sports experience, how can you direct where the sports fan needs to focus in order to follow the narrative?
2. What type of story could you tell about a team/play/player with data that surrounds you?
This question relates to statistics and data visualization. In virtual reality, how can the sports fan experience the most exciting parts of a game, which often last only moments? Oftentimes, the best part of watching or experiencing a game does not happen in 20 seconds of play or in an hour. How can you tell a story about discrete moments in time? What cues (e.g., haptic, audio, interactive, visual) can help the viewer?
3. How will you interact with other fans when we’re all wearing funny goggles on our heads?
This question relates to social experiences. Whether a viewer is at a stadium or on the couch watching the game, how can the viewer be aware of other people in VR? How can they interact with each other?
4. Which approach is better: totally synthetic sports representations (i.e., content that looks like a videogame, fully rendered) or reality with synthetic content on top (i.e., content that looks like TV with enhanced displays over the broadcast, but better)? What are the pros and cons of each?
This question relates to visualization of content. When is it better to produce a fully rendered experience (you can even consider avatars) versus a live-broadcast experience with overlays? Consider how Statcast is currently used to enhance the sports fan’s experience.
5. How can we get closer to the action without degrading the quality of the picture?
Cameras are not allowed on the field. Typically, cameras capture video of what’s happening on the field from different heights and distances. What are new ways a camera could capture and follow action on the field, with feeds that are comfortable for the sports fan to experience?
6. Can you find a way to make content on mobile VR look awesome without melting the phone?
This question relates to the technical specs of mobile VR hardware. Consider how phones heat up easily and how battery life is limited. Rendered content requires more power to deliver and is harder to produce, while 360 video has its own limitations.
Prizes for winning teams
- Grand Prize: $5,000
- First Prize: $2,500
- Second Prize: $1,000
- Third Prize: $500
Additional Prize: Tour of MLBAM’s headquarters in New York’s historic Chelsea Market.
Meet leaders from MLBAM
- Dirk Van Dall, VP, Technology Development
- Don Vu, VP, Data and Advanced Analytics
- Alexander Reyna, Creative Director, Games
- More to be announced...
Apply to attend the hackathon
The goal of the application process is to make sure teams consider the challenge questions and their prototype idea before attending the hackathon.
Applications are accepted on a rolling basis until Friday, February 3rd, 2017.
If you have any questions about this hackathon, contact Amy Chen at firstname.lastname@example.org