AudioMasters
 
Topic: Sample Rate Question  (Read 2141 times)
« on: October 02, 2008, 04:39:05 PM »
Randy Lahey Offline
Member
*****
Posts: 45



I have been a producer for about a year now, and just ran into this problem in the past week. Actually, twice in the past week. I've noticed some spots I've downloaded have a 48k sample rate. The bad part is that I didn't notice until an announcer complained to me that a spot was cutting off before it was done. As I've said, I've noticed it twice, and both times with big national companies. I'm just wondering why they wouldn't be using 44.1. I thought that was the industry standard.
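(One plausible mechanism for the cut-off, as an illustration rather than something confirmed in the thread: if the playout chain is clocked at 44.1 kHz and the automation cuts away at the scheduled length, a 48k file's samples play out at the slower clock and the spot overruns its slot.)

```python
# A 60-second spot delivered at 48 kHz contains this many samples:
samples = 60 * 48_000
# If the playout hardware clocks them out at 44.1 kHz instead,
# the spot takes longer than its scheduled 60 seconds:
actual_seconds = samples / 44_100
print(round(actual_seconds, 1))  # about 65.3 s, so roughly 5 s gets cut off
```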

It's not a huge deal for me. I just have to pay more attention, but it does add an extra step for me, and for every other radio producer for that matter. Now, if I find a spot at 48k, I either have to go into Audition to resample it, or dub it in real time from one computer to another. Extra steps that I feel are unnecessary.
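That resampling step can also be scripted in a batch. A minimal sketch using only Python's standard library (the function name and the crude nearest-sample method are my own illustration; a real sample-rate converter such as Audition's applies proper filtering and sounds much better):

```python
import array
import wave

def resample_to_44100(src_path, dst_path):
    # Naive nearest-sample resampler for 16-bit mono WAV files.
    # A quick sketch only: no anti-alias filtering, so quality is rough.
    with wave.open(src_path, "rb") as src:
        assert src.getsampwidth() == 2 and src.getnchannels() == 1
        rate = src.getframerate()
        samples = array.array("h")
        samples.frombytes(src.readframes(src.getnframes()))
    if rate != 44_100:
        n_out = len(samples) * 44_100 // rate
        samples = array.array(
            "h", (samples[i * rate // 44_100] for i in range(n_out)))
    with wave.open(dst_path, "wb") as dst:
        dst.setnchannels(1)
        dst.setsampwidth(2)
        dst.setframerate(44_100)
        dst.writeframes(samples.tobytes())
```

Pointing a loop over a download folder at this would at least catch the 48k files before they reach the playout system.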

So I guess my question is...is there a valid reason for them to be sampling at 48k?


...and I guess another question for producers using Scott Studios while I'm at it. Is there a setting where TLC could just change the sample rate automatically, or even just accept only 44.1? I don't know why it would let you use another sample rate, as it's obviously going to cause problems.

Thanks again.

Love the board. I'd be nothing without you guys.

Randy. 
Reply #1
« on: October 02, 2008, 05:18:34 PM »
Graeme Offline
Administrator
Member
*****
Posts: 2230


Quote from Randy Lahey: "I'm just wondering why they wouldn't be using 44.1. I thought that was the industry standard."

Not really. The 'industry' had pretty much settled on 48 kHz as a standard for the digital tape systems that were being developed (both studio systems and the domestic DAT machines). The 44.1 kHz rate only came to the forefront following the introduction of the CD (for reasons related to video formats that were then standard).

It seems that many studios are still using 48 kHz, in spite of the fact that much of their output will eventually end up on CD anyway and require a sample rate conversion at some point.

I'm no radio man, so I can't answer your other question, but I'm sure there are plenty others here who can.

Reply #2
« on: October 02, 2008, 07:03:50 PM »
Wildduck Offline
Member
*****
Posts: 711



I may be a bit out of date, but my experience has been that most higher-end systems operate at 48kHz, partly because this matched the sample rate on digital bearers between studio centres, at least across Europe.
Lately I've encountered more systems running at 44.1, but these have been systems in smaller, more stand-alone stations. There used to be a few running at 32kHz, but I've not seen any recently.
When I worked with play-out systems we had to know the parameters of the target systems, and I wrote some file-transfer software that, amongst other things, normalised levels on-the-fly to those expected by the target system. So I think vigilance is the only sensible approach.
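In outline, that kind of on-the-fly level adjustment might look like the sketch below (hypothetical and peak-based; the function name and target level are invented for illustration, and a real broadcast transfer tool would more likely work to an RMS or loudness target):

```python
import array

def normalize_peak(samples, target_peak=0.7):
    # Scale 16-bit PCM samples so the loudest one lands at target_peak
    # of full scale (0.7 of 32767 here, purely as an example level).
    peak = max(abs(s) for s in samples) or 1
    factor = target_peak * 32767 / peak
    return array.array("h", (int(s * factor) for s in samples))
```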

My understanding was that many of the modern radio broadcast systems only accept input material via the system database, and the system should carry out any file conversion at the input to the database process. Often, this can involve going through an additional A-to-D stage.

I should stress again that I no longer work actively with this type of system, so treat anything I say with caution.
Reply #3
« on: October 02, 2008, 08:11:47 PM »
Emmett Offline
Member
*****
Posts: 449


48k is the video standard. Receiving files from a studio that specializes in video production often means the radio spot matches. TV network spots for radio, for instance, are often delivered at 48k. My cluster didn't allow us to rip files into Scott... everything was dubbed in real time to make sure levels were consistent, using RMS meters rather than ripping and normalizing to a peak value. So sample rates were never an issue, except in studios equipped with Yamaha 02R consoles that were locked to 44.1.
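The distinction between RMS metering and peak normalization comes down to two different measurements of the same samples: the loudest single sample versus the average power, which tracks perceived loudness far better. A small illustrative sketch (the function name is mine):

```python
import math

def peak_and_rms_dbfs(samples):
    # Measure a block of 16-bit PCM two ways: peak (loudest single
    # sample) and RMS (average power), both expressed in dB full scale.
    peak = max(abs(s) for s in samples) / 32768
    rms = math.sqrt(sum(s * s for s in samples) / len(samples)) / 32768
    to_db = lambda x: 20 * math.log10(x) if x > 0 else float("-inf")
    return to_db(peak), to_db(rms)
```

Two spots normalized to the same peak can easily sit several dB apart on an RMS meter, which is why real-time dubbing against RMS meters kept the levels consistent.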

Emmett
Reply #4
« on: October 02, 2008, 09:20:07 PM »
oretez Offline
Member
*****
Posts: 647



Don't speak radio, so I'm unaware of how prevalent or pervasive various 'standards' might be (but even a cursory look at broadcast history shows a field littered with candidates).

There never has been 'a' standard sampling rate. A lot of the early work on digital representation of audio was done to improve multiplexing of telegraph signals and goes back to the mid-1800s. A lot of the subsequent work, at least based in the US, occurred at Bell Labs (which, if I remember correctly, was Nyquist's employer in the twenties when he added his contribution to the formula that bears his name), and was explored to improve telephony.

Telegraph and telephone remained the installed economic base for much of the development of digital audio technology, with broadcast acquiring greater importance over time. On a practical level this means that very few of the tools we use for recording music were pioneered with recording music in mind. (Admittedly not the OP's area of interest.)

I'm not sure it is possible to arrive at a consensus as to 'why' specific sampling frequencies and bit rates were used. Practical implementation, certainly including the cost of available components, probably played at least as significant a role as theory. Anyway, some standard sampling frequencies with which I've worked: 8 kHz (telephone, intercoms, presentation wireless mics), 11 kHz (digital crossovers for sound reinforcement), 22 kHz (MPEGs, and way more 78-to-CD conversions than I care to think about), 32 kHz (consumer digital camcorders, at least one European radio network, presentation wireless mics). Then there is a variation of 44 kHz (the CD standard is 44.1) based on NTSC video at 44056, which can be a much subtler issue to diagnose than 48 vs. 44. The 44.1 adopted by the CD Red Book might have been derived from the PAL video standard (samples times lines times frames per second). The first DAT (though as it was Denon, not Sony, I suppose I should call it something else) used 47250 as a sampling frequency. Early pro digital recorders (Soundstream, 3M) employed 50 kHz, while Mitsubishi's 'ProDigi' (X-80 series) adopted 50400. 48 kHz was used by a lot of gear designed for film work; it was adopted by DVD, with double that rate eventually settled on for the DVD-A 'standard'.
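The video-derived figures above check out arithmetically, on the commonly cited assumption that early PCM adaptors stored 3 audio samples per usable video line:

```python
# PAL-derived rate: 3 samples/line x 294 usable lines x 50 fields/s
pal_rate = 3 * 294 * 50
print(pal_rate)            # 44100

# NTSC variant: 3 x 245 x 60 fields/s, slowed by NTSC's 1000/1001 factor
ntsc_rate = 3 * 245 * 60 * 1000 / 1001
print(round(ntsc_rate))    # 44056
```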

Even at this time, all 'standards' are local, not global, dependent on the installed base of tech for whatever community you are working with. 'Standards' generally roll out as industries mature; from the early 80s through the nineties, technology was evolving rapidly enough that standardization might well have been counterproductive.

So? Vigilance is the rule. And it's far easier to determine what you have now than in the days of tape, when even if you knew what you had, you still couldn't necessarily use it.