On show from Hitomi Broadcast (N2331) is a new latency feature for its MatchBox audio-video alignment toolbox, aimed at complete timing quality assurance. It measures the actual time of flight of video signals from the front of multiple cameras, or at various points through the broadcast chain, with millisecond accuracy. A measurement is taken by holding an iPhone running the free MatchBox Glass app up in shot. A signal is then sent back to the MatchBox Analyser located in an OB truck or MCR. The company says this simplifies the task of measuring latency and additionally gives a lip-sync reading.
WT Vision (integrated with the Mediapro Group at N5420) is showing complete workflows to cover live sports production and is offering live demos of BasketballStats CG - Fiba Edition, specially designed to offer real-time graphics and official data for basketball competitions, as well as augmented reality (AR) software.
Codemill (W541) will be demonstrating the latest features of its Accurate.Video and Cantemo products. Latest updates to the former include the addition of workspaces for ad breaks, video, audio, subtitle and time-based metadata compliance, localisation and content versioning QC, as well as expanded audio support. The latest version of Cantemo, now available on a subscription model, adds support for remote proxy editing workflows using Adobe Premiere and Final Cut Pro X, enabling seamless integration between the editing tools and Cantemo. It also includes support for Vidinet APIs.
IBC is run by the industry, for the industry. Six leading international bodies own IBC, representing both exhibitors and visitors. Their insights ensure that the annual convention is always relevant, comprehensive and timely. It is with their support that IBC remains the leading international forum for everyone involved in content creation, management and delivery.
Carmen: Yeah, there is, definitely. We're trying to do machine learning in taking motion capture of different sign languages. There's actually more than one - a lot of people don't know that there's no universal sign language - well, there is, but it's not really used.
Carmen: It's a blessing and a curse, being a DevOps engineer. There's so many things you grab. I like the automation of infrastructure - creation of infrastructure - and creating resilience, like autoscaling and reliability. I like being an architect. They say this is our plan for now, but then you have to leave some leeway for the future, so it's kind of open-ended, right? I want to create scalable and reliable infrastructure. Infrastructure as code is Terraform, Ansible, and Kubernetes obviously, but also at the same time, being able to leave that room so that we can continue the architecture of growth. The product will always change depending on how popular you get. I love working in the automation of our infrastructure and being able to create new customer infrastructure each and every time.
Carmen: I don't think there is an ideal path. Sometimes it's like, you just get into a place, right? I didn't know I was going to be in academia and be in industry at the same time. It just so happened because I had another passion. If you think about going back to school 20 years ago and being a Latina, growing up without the same means as other people, the receipts had to be there, right? So in my mind, and the way that I was raised 20 years ago - there were no bootcamps, there was no Facebook, there was none of that. There was no LinkedIn or Lynda, right? We didn't really have that. Your only alternative was to go to a community college for your first two years, and maybe go to a four-year.
Carmen: Yeah! I didn't take risks. A lot of the decisions I made, like what college I went to and what industry I went into, were based on the need of taking my family out of the situation we were in monetarily - to be somewhere better. I didn't really choose finance, it kind of happened - thankfully, it just happened. I got really lucky that it happened at the time where, in the early 2000s, things were popping. The work I did was so fruitful and so engaging, but at the same time, I was making a good amount of money to even support my two siblings to go to private school. So I made decisions based on the needs of my family and not the need of taking risks and learning.
Carmen: It changes with the time and that's just like with everything, right? I went to an all-girl high school in the southside of Chicago, and then I went to a college in a major where I was probably - almost 98% - the only female in class. I went from an all-girls school where we were all killing it, right? AP calc, AP stats, AP whatever you want to call it, you were killing it, right? Those are my sisters. And then I went to this major where I was usually the only female and it was awkward. Everyone's socioeconomic status was way the hell higher than in my high school. We were an urban high school so everyone was pretty much working class. So that was different.
Basic coding doesn't have you do towers of Hanoi. You cannot give up on a subject just because the foundation seems super hard. Not every computer scientist took AP calc and got an A. It's okay if you suck at math, it's okay if you suck at certain sciences. Computer science is so overarching, you shouldn't give up on it. You just need to find your niche within it. If you really love tech, you'll find your niche. It'll take some time, but you'll find it. So don't give up.
Rox: Yeah. And honestly, I don't even know a single bootcamp that offers a DevOps course, because it is so complicated. Your average bootcamp will last, what, three months? There's just not enough time to go over every area of DevOps like that. That needs to be a thing.
Carmen: Yes, you can. Sí se puedes. Whatever you're going through, cry it out, dance it out like in Grey's Anatomy, whatever you need to do. Just keep going forward. If this is really your passion, no matter how much the science bogs you down, or how much the area grouping of (sometimes) all men gets kind of crazy, you can get through it. There's always people that are helping to push you up and see you achieve greatness. So keep going. Sí se puedes.
However, when data are collected without the availability of a clear response (e.g. survival time, tumor size, cell growth) using multiple different technologies, data integration requires organizing patterns that enable interpretation. Clustering is an unsupervised method often used to infer latent variables - for example, a categorical variable such as cell type that was not directly measured in the data but enables simple interpretation. Unfortunately, biological phenomena are often not as clear-cut.
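As a minimal sketch of this idea, the toy example below clusters simulated expression data in which "cell type" is never observed; the cluster assignment serves as the inferred latent categorical variable. The data, gene count, and cluster number are all hypothetical illustrations, not drawn from any real study.

```python
# Sketch: recovering a latent "cell type" label via unsupervised clustering.
# All values here are simulated and illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Simulate expression of 2 genes for two unobserved cell types.
type_a = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(50, 2))
type_b = rng.normal(loc=[3.0, 3.0], scale=0.3, size=(50, 2))
expression = np.vstack([type_a, type_b])

# Cluster without ever seeing the true labels; the cluster assignment
# plays the role of the latent categorical variable (e.g. cell type).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(expression)

# Cells from each simulated type land in a single, distinct cluster.
print(len(set(labels[:50])), len(set(labels[50:])))  # → 1 1
```

The point of the sketch is the interpretive move, not the algorithm: any clustering method could stand in for KMeans here, and the recovered labels are only a proxy for a biological category that was never measured.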
During clustering, over-separating the data by forcing it into discrete types provides only a static description when the variation often lies along a continuum. Indeed, although a latent factor can be a useful first approximation, the development of cells and their fate is a dynamic process. Thus, we recommend referring back to the original data that enabled interpretation of the cell trajectories: in our case, where the underlying latent variable of interest is expressed along a gradient of development (e.g. pseudo-time, disease progression).
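To illustrate the contrast with discrete clustering, the sketch below recovers a crude pseudo-time by ordering simulated cells along their first principal component. This is an intentionally simplified stand-in for real trajectory-inference methods; the gradient, gene trends, and noise level are all invented for the example.

```python
# Sketch: recovering a continuum ("pseudo-time") instead of forcing
# discrete clusters. All numbers are illustrative.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Simulate 100 cells whose expression of 3 genes drifts along a
# hidden developmental progression t.
t = np.linspace(0.0, 1.0, 100)                      # hidden progression
expression = np.column_stack([2 * t, -t, 0.5 * t])  # linear trends per gene
expression += rng.normal(scale=0.05, size=expression.shape)

# PC1 captures the dominant gradient; its rank order is a crude pseudo-time.
pc1 = PCA(n_components=1).fit_transform(expression).ravel()
pseudo_time = np.argsort(np.argsort(pc1))

# The recovered ordering correlates strongly with the true progression.
corr = np.corrcoef(pseudo_time, t)[0, 1]
print(abs(corr) > 0.9)
```

A hard two-cluster split of these same cells would discard exactly the gradient structure that the continuous ordering preserves, which is the practical cost of over-separation described above.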
As many methods suffer from identifiability issues, redundant biological knowledge can be enlightening. By providing information on the extreme points in a map, or brushing a map with known gene expression features, one can delineate orientations and clusters. As an example, it is only through coloring by CD56 across time that we can see the dynamics of immune response, similar to the principle behind the interactive brushing illustrated in Figure 5C.
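The brushing idea reduces to a simple operation: threshold a known marker and see where the selected cells fall on the map. The sketch below mimics this with a toy 2-D embedding and hypothetical CD56 values; the coordinates, expression ranges, and threshold are all illustrative assumptions.

```python
# Sketch of "brushing" a 2-D map by a known marker: cells whose
# (hypothetical) CD56 expression exceeds a threshold are selected,
# revealing a coherent region of the embedding. All values illustrative.
import numpy as np

rng = np.random.default_rng(2)

# Toy 2-D embedding (as produced by e.g. UMAP/t-SNE) and per-cell CD56:
# the second group of cells sits apart and expresses CD56 highly.
embedding = np.vstack([
    rng.normal([0.0, 0.0], 0.5, size=(40, 2)),
    rng.normal([4.0, 4.0], 0.5, size=(40, 2)),
])
cd56 = np.concatenate([rng.uniform(0, 1, 40), rng.uniform(3, 5, 40)])

# "Brush": boolean mask of high-CD56 cells, used to color/select points.
brushed = cd56 > 2.0
centroid = embedding[brushed].mean(axis=0)
print(brushed.sum(), centroid)  # the brushed cells cluster near (4, 4)
```

In an interactive tool the same mask would drive the point colors, letting the analyst anchor an otherwise unidentifiable map to known biology.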
As such, no universal benchmark data scheme may suit every combination of modalities (e.g. a missing-cells design does not generalise to the spatial context), and benchmark datasets should be established for commonly used combinations of modalities or technologies towards specific data integration tasks.
Open-source software efforts facilitate a community-level coordinated approach to support collaboration rather than duplication of effort between groups working on similar problems. Real-time improvements to the tool-set should be feasible, respecting the needs for stability, reliability, and continuity of access to evolving components. To that end, exploration and engagement with all these tools is richly enabled through code sharing resources. Our hackathons directly leveraged GitHub, with reproducible analysis reports enabling continuous integration of changes to source code (using GitHub Actions) and containerized snapshots of the analysis environments. The hackathon analyses conducted in R were assembled into R packages to facilitate library loading, while those conducted in Python enabled automatic installation and deployment.
Finally, in the absence of universal standards, the metadata available may vary across modalities or independent studies (e.g. spatial proteomics), underscoring the need for the computational biology community to define the minimum set of metadata variables necessary for each assay, as well as for pairs of assays to be comparable for common analyses.
The Mathematical Frameworks for Integrative Analysis of Emerging Biological Data Workshop demonstrated the power of hackathons to both inform and develop new analysis methods to capture the complex, multi-scale nature of biological datasets from high-throughput data modalities. Notably, the hackathon studies of the workshop were specifically designed to span state-of-the-art multi-omics challenges to map epigenetic, molecular, and cellular interactions across time and sample populations. Single-cell measurements spanning molecular modalities can inherently simplify the challenge of linking disparate biological scales, but layering new sets of molecular measurements increases the