
Rackspace::Solve New York: How Under Armour Solves For Rapid Ecommerce Growth

September 8, 2014 2:00 pm


By Brian McManus, Senior Director of Technology, Under Armour

When Under Armour started, we had just four web servers and a database server. Back then we were called KP Sports – named after our founder and CEO Kevin Plank.

We grew. And we grew quickly. We needed infrastructure, and a partner, that could support our ecommerce site's high traffic (especially during seasonal spikes or a Super Bowl commercial), stay up around the clock and grow right alongside us.

The Under Armour/Rackspace story is built on nearly a decade of growing together. When we signed on with Rackspace, we were a small startup making and shipping shirts. Rackspace was a scrappy hosting upstart. Together, we've grown in our respective industries and with each other. At Under Armour we've doubled down with Rackspace because of our similar paths and cultures. We have parallel stories, and our businesses are each built on doing the right thing for our customers.

Our journey from four servers to 200 is a testament to the foundation we’ve built on Rackspace. And we know Rackspace will be there for us to support our future growth and ensure we can continue to scale to meet increasing demands in the ultra-competitive ecommerce world.

Want to hear more about how Under Armour works with Rackspace to solve for high-demand ecommerce and growth? Brian McManus, senior director of technology at Under Armour, will present at Rackspace::Solve New York, a one-day summit where you'll hear directly from companies like Under Armour about how they're solving tough challenges in their businesses. Rackspace::Solve New York is Thursday, September 18 at Cipriani Wall Street.

Register now for Rackspace::Solve New York.

And stay tuned for details of the next Rackspace::Solve event in Chicago.

Does 4K Need High Dynamic Range to Succeed?

August 18, 2014 12:06 pm


As mentioned in my previous blog before the summer break, high dynamic range (HDR) has featured prominently on the agenda as the missing piece of the UHDTV armory needed to trigger commercial success for the format. The recent DVB-EBU trials in Munich certainly showed impressive results, but how the technology will be realized, what production workflow can be used (particularly for live event coverage) and whether it can be launched in cost-effective TVs have yet to be answered.

Screens are evolving and prices are tumbling to levels likely to attract buyers, who will no doubt feel they are buying enhanced web streaming and HD upscaling today while also investing in a future-proof TV that will be able to decode and display 4K. By and large this is true, with most of the 2014 crop of screens supporting the latest version of the HDMI spec and an HEVC decoder capable of handling the limited number of 4K streaming movie services available. Buyer beware, though, for two reasons. First, there is still legacy functionality in the current “latest products” that will only become apparent when a 4K streaming service is actually run on the screen. More importantly, we are still in the midst of a phased introduction of UHDTV, which could leave a purchase made now quickly lacking the latest wow-factor feature for 4K. HDR is one such feature, but there are likely to be more, an unfortunate side effect of companies drip-feeding features to unsuspecting buyers. The lack of genuine 4K sources is disguising this from many early adopters, as is the limited number of viewers with sufficient broadband bandwidth to sign up for Netflix's 4K streaming service.

Should you not have 15-20 Mbps broadband connectivity, 1080p would be the fallback option, but is it really second best? Many of the HDR demonstrations were made in 1080p, and as those of you who visited the Harmonic booth at IBC last year and NAB this past spring will know, I have long been an advocate of 1080p transmission of 4K-sourced and 4K-displayed material at the bit rates recommended for 4K streaming services. It may seem counterintuitive at first that 1080p should be preferred over native 4K transmission, but it is a credible stance. Of the few 4K streamed services available at the moment, many are in fact 1080p or are showing 2K digital cinema content. Even though the bitrates recommended for 4K are high, they may not be enough to prove 4K's supremacy over 1080p for demanding sports content. This will only become apparent when early adopters switch over from 4K streaming of movies to the much-debated rollout of live UHDTV broadcasts.
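
To make the bandwidth argument concrete, here is a rough back-of-the-envelope sketch in Python comparing the coded bits available per pixel at the same bitrate for native 4K and for 1080p. The 18 Mbps figure sits within the 15-20 Mbps range mentioned above; the resolutions and the 50 fps frame rate are illustrative assumptions, not service recommendations.

```python
# Back-of-the-envelope comparison of coded bits per pixel at a typical
# streaming bitrate. 18 Mbps, 50 fps and the two resolutions below are
# assumptions for illustration only.

def bits_per_pixel(bitrate_mbps, width, height, fps):
    """Average coded bits available per pixel per frame."""
    pixels_per_second = width * height * fps
    return (bitrate_mbps * 1_000_000) / pixels_per_second

uhd = bits_per_pixel(18, 3840, 2160, 50)  # native 4K (2160p50)
hd = bits_per_pixel(18, 1920, 1080, 50)   # 1080p50 carrying 4K-sourced material

print(f"2160p50 @ 18 Mbps: {uhd:.3f} bits/pixel")
print(f"1080p50 @ 18 Mbps: {hd:.3f} bits/pixel")
```

At a fixed bitrate, the 1080p stream has roughly four times as many bits to spend on each pixel, which is why 4K-sourced material delivered at 1080p can hold up better than a starved native 4K stream on demanding sports content.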

What is certain is that HDR will not yet be factored into the current crop of screens, which are clearly targeted only at 4K streaming services. Broadcast formats and specifications are still in a state of flux and, more worryingly, there appears to be no effort to merge TV and cinema needs from a workflow perspective, leaving 1080p as the safest interim choice until UHDTV is fully sorted.

– Ian Trow, Senior Director, Emerging Technology and Strategy

10 Reasons Why Virtualization Will Happen in Broadcast

August 13, 2014 9:20 am


  1. Virtualization is now mainstream in data centers, with up to 70 percent adoption. Broadcast architectures are now dominated by IT infrastructure, so further rationalization in broadcast will mean embracing a technology that is already dominant in the IT and enterprise world.
  2. Video processing is heavily reliant on CPU and storage. Traditionally, this has meant bespoke products and solutions were required to address the processing, bandwidth and storage requirements. Commodity servers can now meet those requirements for all but the most demanding video applications.
  3. Traditional broadcast headends have been architected around the technology implementations of individual products, resulting in a non-optimal partitioning of functionality.
  4. Redundancy in broadcast has traditionally been about replicating functionality to achieve a high degree of resilience to failure within a system. This has led to significant overprovisioning, which could be avoided through a more dynamic approach to resource allocation in the event of failure (see the sketch after this list).
  5. Technology refresh in the broadcast domain is seldom a “like for like” replacement these days, so separating the functionality implemented from the underlying hardware is a logical way to provide flexibility.
  6. Agility is now key in today’s media landscape, with scheduling demands and evolving playout platforms dictating a more flexible approach to deployments in order to adapt to changing media and content streaming needs.
  7. While the ability of servers to absorb functionality previously available only in dedicated hardware is undoubtedly the major initial draw of virtualization, the medium- to long-term appeal is the ability to layer a range of media functionality dynamically across virtual machines without causing network or storage contention.
  8. Social media, audience preference and targeted advertising are all examples of analytical data that are essential complements to any form of programming today. These back-office functions are already heavily virtualized, mining vast amounts of data and transforming the unstructured into the essential drivers behind programming.
  9. Running multiple functions on virtual machines reduces port counts, interconnections, rack space and power.
  10. Bespoke broadcast infrastructure is expensive to support compared with more commonplace IT/enterprise installations. Virtualization frees media specialists to concentrate on media content rather than the background infrastructure.
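
To illustrate the dynamic failover idea from point 4 above, here is a minimal, self-contained Python sketch. It is a toy model only: the host names, vCPU capacities and media functions are hypothetical, and a real deployment would rely on an orchestration layer rather than hand-rolled placement logic.

```python
# Toy model of point 4: instead of a 1:1 hot spare per function, a small
# shared pool of headroom absorbs whichever host fails. All host names,
# capacities and function loads are hypothetical illustration values.

HOSTS = {"host-a": 16, "host-b": 16, "host-c": 16}   # vCPUs per host

# Media functions currently placed on hosts: function -> (host, vCPUs).
PLACEMENT = {
    "encode-ch1": ("host-a", 8),
    "encode-ch2": ("host-b", 8),
    "mux":        ("host-a", 4),
    "packager":   ("host-b", 4),
}

def free_capacity(hosts, placement):
    """Remaining vCPUs per host after accounting for placed functions."""
    used = {h: 0 for h in hosts}
    for host, vcpus in placement.values():
        used[host] += vcpus
    return {h: hosts[h] - used[h] for h in hosts}

def fail_over(failed_host, hosts, placement):
    """Reassign functions from a failed host to remaining free capacity."""
    survivors = {h: c for h, c in hosts.items() if h != failed_host}
    new_placement = {f: p for f, p in placement.items() if p[0] != failed_host}
    free = free_capacity(survivors, new_placement)
    for func, (host, vcpus) in placement.items():
        if host != failed_host:
            continue
        # First-fit: place on any surviving host with enough headroom.
        target = next((h for h, c in free.items() if c >= vcpus), None)
        if target is None:
            raise RuntimeError(f"no capacity left for {func}")
        new_placement[func] = (target, vcpus)
        free[target] -= vcpus
    return new_placement

print(fail_over("host-a", HOSTS, PLACEMENT))
```

The point of the sketch is the sizing: rather than a dedicated spare mirroring every function, a modest pool of shared headroom absorbs whichever host fails, which is where the savings over static overprovisioning come from.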

Interested in Knowing More?

I’ll be hosting a Virtualizing Video webinar on Thursday August 21st at 12 p.m. EDT in conjunction with TV Technology. Details on the free webinar are available here.

– Ian Trow, Senior Director, Emerging Technology and Strategy