Peter Symes, SMPTE’s director of standards and engineering, details the technical group’s efforts at developing standards for 3-D television and discusses other work affecting broadcasting, including audio-video synchronization, aspect-ratio conversion, closed captions, a new SMPTE time code, broadcast automation and ultrahigh-definition television.
By the time Americans are ready to buy their next big-screen HDTV sets, they may find that they can do more than just watch sparkling clear pictures and listen to CD-quality sound on them. With the aid of special glasses, they may also be able to enjoy a big game or the latest movies in 3-D, an enhancement that some believe is just as compelling as HD itself.
For that to happen, the TV and consumer electronics industries will have to settle on a standard for 3-D mastering and distribution. And, fortunately, that work is well underway by the Society of Motion Picture and Television Engineers, better known by its acronym, SMPTE.
As he prepared for the SMPTE Annual Tech Conference & Expo, going on now at the Hollywood Renaissance Hotel in Los Angeles, Peter Symes, SMPTE’s director of standards and engineering, took time to discuss the 3-D effort and other work affecting broadcasting, including audio-video synchronization, aspect-ratio conversion, closed captioning, a new SMPTE time code, broadcast automation and ultrahigh-definition television.
Symes began his long career in TV engineering at the British Broadcasting Corp., but spent the bulk of it (1983 to 2007) at Grass Valley in a number of positions involving strategic planning, intellectual property and technological liaison. He represented the manufacturer at SMPTE, serving two terms as engineering vice president and one as financial vice president. In July 2007, he left Grass Valley to join the SMPTE staff.
An edited transcript:
What’s the status of the 3-D standardization effort?
We’ve got a group [Working Group on 3-D Home Master] chaired by Ted Szypulski of Disney, and it will soon hold its first meeting. About 60 people have signed up from one end of the food chain to the other. An important part of this is not just going to be the images and the sound, but making sure that we have the appropriate information for all the potential coding and distribution systems. This is going to need quite a complex set of metadata.
Will TV stations be able to broadcast 3-D if they choose?
Traditional big-stick RF broadcast is the biggest challenge because they’re the guys with the most limited bit rate and the biggest requirement for absolute compatibility with the existing standard. Clearly, satellite and cable and physical media have got bigger budgets when it comes to bringing out new stuff like this, and those will be the predominant directions at first. Broadcasters will look to do something when they can.
But broadcasters will be able to broadcast 3-D, right?
Clearly the potential is there. The standard will have the necessary information to support whatever coding scheme broadcasters choose to use to get it done. The complicated part is what you do when you only have a 2-D channel, and how you handle things like 2-D compatibility.
I’m sure it’s going to happen, but what I don’t know, given all the constraints on broadcasters financially, is when it will happen.
How much of a TV station’s digital channel will it take to broadcast 3-D?
My pure guess is, it’s probably about like adding an SD channel, but I’m absolutely not the expert on that. A lot of people are doing that. They’re doing HD plus SD multiplexes now. Of course, you could only do it when the capability is built into receivers. Today, we are seeing 3-D capable displays going out there, but we’re not seeing ATSC receivers with a mechanism for decoding a 3-D broadcast because that 3-D broadcast has not been designed yet.
And will it be backward compatible? Will I be able to watch programs broadcast in 3-D on the HD set I bought two years ago?
You won’t be able to watch in 3-D, but, yes, from the point of view of all the different distribution mechanisms, they have got to provide a backwards compatibility mechanism so you can watch it in 2-D.
When do you expect a standard to be published?
Late 2010 or early 2011 would be my guess. It’s one of these areas where we’ll be developing the standard alongside people who are developing the applications. The important thing is to get it right rather than get it early and wrong.
After that, how long will it take to get actual product into my home on some kind of medium?
A surprisingly short length of time, typically, because, first of all, the 3-D-compatible displays are selling now. Cable companies and Blu-ray are obviously going to be players. The satellite companies can actually deploy stuff remarkably quickly, particularly if they are participating in the standards development as it goes along. I am not a consumer electronics expert by any stretch of the imagination, but usually it surprises me how little time it takes.
What about the 3-D glasses? Do all the approaches being considered require glasses?
All of the reasonable displays we have available today need glasses. There have been a lot of what I regard as science experiments with autostereoscopic displays. We have got quite a number of papers on this at the conference [this week], but I certainly have never yet seen an autostereoscopic display that I would be prepared to sit down and watch.
What are the biggest obstacles to 3-D? What could derail this whole thing?
The lack of content is the thing that could slow it down, but I firmly believe that 3-D displays are going to get into a significant number of homes because of games, if nothing else. Once you have got the capability of displaying 3-D in the home, there are going to be people demanding other sorts of content in 3-D and there will be people out there who will provide it. We’ve gone far enough to know there are no technical blocks to doing it. There are greater and lesser challenges on different distribution mechanisms, but we know it can be done. I don’t think it’s derailable.
Let’s shift gears now. I understand that SMPTE is working on updating the synchronization and time labeling standards for video production. What are your goals for that? What are you trying to do there?
For once, I really think we got the timing exactly right. We have got standards for synchronization at the moment and we’ve got a standard for time labeling, which is the SMPTE time code. They’re both 30 years old. They work. They’re not totally broken, but they’re no longer appropriate.
So although we have got a system that works at the moment, people are forced to work around its limitations.
We think that we are going to come up with an extremely versatile system, which we’re hoping will not need its own dedicated infrastructure. We think it will be able to be distributed around the planet on control networks like Ethernet or the Internet and provide more capabilities for multistandard operations, unambiguous audio timing and other things that the industry can adopt gradually if we get it right.
What’s your time frame for that work?
I’m guessing it’s going to be a couple of years to get the standards out on that, but I suspect that the equipment will appear virtually on the same day as the standard because all of the people likely to make equipment will be part of the development process. The fundamental decisions are going to be made fairly soon. So I think we’ll start to have availability in early 2012, something like that.
For the past few years, Japan’s NHK has been demonstrating 4,320-line ultrahigh-definition television. What’s happening with its standardization?
Oh, we’re already working on it. In fact, we already have a number of standards in place. The big value of standards at the moment is that everybody working on it over the development period, which might be another 10 or 15 years, has at least got something to hang their hats on and doesn’t need to start inventing all the parameters from scratch.
One of your recent accomplishments is the BXF standard. What kind of impact is that going to have on station operations?
More than anything else it’s going to have a big efficiency and financial impact by tying together traffic systems and automation systems. It gets rid of an awful lot of the manual processes, which is cost effective in terms of the actual cost of doing it and very cost effective in terms of avoiding the errors. As-run logs, reconciling with financial sales records, invoicing, all sorts of things are made more efficient, more reliable by BXF.
We had up to 80 companies participating at one stage. It was just a real industrywide effort to solve a problem that most people had passed off as insoluble. It’s a tremendous achievement.
Is there anything else you’re working on that broadcasters should know about?
Just a whole slew of things. One is audio-to-video synchronization, which has always been a minor problem and has tended to become a major problem in recent years. SMPTE only has a part of that. We’re working with ATSC and others to sort of say, how do we all put our standards in place to make sure this problem goes away?
We are also working on the aspect-ratio problem. A 16:9 receiver should be able to show 4:3 pictures without making people look fat, which is what tends to happen most of the time. SMPTE alone can’t solve it, but we’ve got the core standards that will enable everybody else to build on and deal with the end-to-end solution.
Closed captions are another major issue at the moment. We are looking for a way of handling that in a way that makes sure the captions work all the way through the food chain.