AudioMasters
Topic: CD Writing  (Read 3523 times)
« on: April 09, 2009, 07:49:17 PM »
The FAQ Wizard Offline
Administrator
Member
*****
Posts: 29



From a post by SteveG on 15th June 2001:

There have been several recent debates in this forum about the merits or otherwise of different lengths of CD, and different types of CD writing, sometimes misleadingly called ‘burning’. For the benefit of those people not in possession of some of the basic information, I thought I’d fill in one or two of the gaps in an attempt to clarify one or two things.

The original CD audio standard was the famous ‘red book’ standard published jointly by Sony and Philips back in the mists of time. The important thing to note about it is that it contains the basic standards to which audio must adhere in order to be recorded at all, and it is the fundamental standard on which all the others are based. It specifies details about the Table of Contents (TOC), the coding and interleave requirements for robust digital recording, and the notorious 74 min. limit. It even allows for 20 Mbytes of other digital data to be recorded, but I’ll ignore that here. And it specifies Disk At Once (DAO) writing, but it would - it was the spec for mass-produced commercial CDs.
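For scale, the 74 min. limit corresponds to a fixed amount of raw audio. A quick tally in Python (44.1 kHz / 16-bit / stereo are the standard CD audio parameters, assumed here rather than stated in the post):

```python
SAMPLE_RATE = 44100     # Hz, per channel
SAMPLE_BYTES = 2        # 16-bit PCM
CHANNELS = 2            # stereo

bytes_per_second = SAMPLE_RATE * SAMPLE_BYTES * CHANNELS    # 176,400 B/s
capacity_74min = bytes_per_second * 74 * 60                 # raw audio bytes

print(capacity_74min)   # 783216000 - about 747 MiB of raw audio
```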

The interleave stuff is quite significant, though, especially in the context of ‘clicks’. Audio data is arranged into interleaving ‘frames’ when it’s written to a CD. This interleaved data from the CD is decoded in firmware when the datastream is reconstructed, and as not all firmware is identical, it rather follows that some players are going to be more susceptible to problems caused by incorrect frame boundary data than others. Most decent CD writing software fills the last frame with trailing zeros, ensuring a correct boundary, but it ain’t always so. It follows that even using DAO with weird numbers in frames can produce clicks, especially if those weird numbers are CE/Audition cue data, not audio.
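What "fills the last frame with trailing zeros" boils down to is padding the audio out to a whole sector before it is written. A minimal sketch (the 2352-byte audio sector size is a standard figure the post doesn't spell out, and `pad_to_sector` is a hypothetical helper, not any real writer's API):

```python
def pad_to_sector(pcm: bytes, sector_size: int = 2352) -> bytes:
    """Pad raw 16-bit stereo PCM with trailing zero bytes so that it ends
    on a whole 2352-byte audio sector boundary, mimicking what decent CD
    writing software does with the last frame."""
    remainder = len(pcm) % sector_size
    if remainder:
        pcm += b"\x00" * (sector_size - remainder)
    return pcm
```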

But if we continue, the confusion starts to set in. I want to make something very clear from the outset. The CD writer in your PC writes CD-R disks. The standard for these is contained within the Orange book standard, Pt. II. This standard was added in 1988, and basically allowed multi-media and multi-session CDs, which by implication means TAO (Track At Once). This is the basis on which Easy CD Creator, Nero, etc. include the option to do it - it’s in the standard.

Now, before anyone gets too excited, let me remind you of what I said earlier - the red book standard is the basic audio one, and has not been superseded, as far as I’m aware. So if you are going to make an audio CD that has any chance of passing muster as the basis of a glass master, ideally it should conform to the rather conservative red book spec if you want to guarantee that it will play on all non-faulty players. The reason that the red book length spec. hasn’t been increased is that there are still a lot of dodgy players around, and not all of them are portables. A lot of the problems stem from the inability of their laser sled servos to track accurately beyond the 74 minute boundary - I have one that consistently won’t play anything beyond 75 mins. If you want to do audio on an Orange Book CD writer, you still have to abide by the red book standard, or run some risks, and here’s another one:

CD-Rs reflect significantly less laser energy back from the disk than commercially duplicated audio CDs. The very best ones can manage 75-80%, I believe, but CD-RWs are much worse, with figures around 20%! The immediate result of this is that the servo has even more trouble coping, as it relies on the reflected beam to track the spiral, and the actual pit reading is an extremely hit and miss affair. So although it’s easily possible to burn 80mins onto a CD, and even Philips have done it, (naughty, naughty) you should be aware that there is a significant difference between a highly reflective commercial CD and your CD-R. Combine all of these factors… well, I won’t do 80mins on a CD-R, anyway.

The real difficulty, though, is that the Orange book standard was changed to allow for 80min disks in 1996. Here’s a quote from TDK:

   
Quote

    The first manufacturer to offer extended-capacity CD-R discs, TDK has been supplying recording studios with 80-minute CD-R discs for music mastering applications since 1996. The company's decision to bring its extended-capacity technology to the wider business and consumer markets follows on the publication earlier this spring of updated industry standards (Orange Book Part II) for 80-minute discs.

    The new CDR-80 (multimedia) and CD-TWIN-R80 (home music) discs differ chiefly from 74-minute discs in the pitch (fineness) of the data track inscribed on the disc. Since data recorded on a CD-R follows a single, spiral track from the center to the perimeter of the disc, a finer track pitch allows more data to be written on the disc. The finer track pitch, however, may make the new 80-minute discs incompatible with some recorders and players. In addition, because some computer recording software does not support 80-minute / 700 MB CD-R, users may have to update their software accordingly or contact the software manufacturer to establish compatibility.

    - TDK
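TDK's point about track pitch is easy to put numbers on: the spiral's total length is roughly the program-area annulus divided by the pitch, and length divided by the playback velocity gives playing time. A back-of-envelope check in Python (the 25-58 mm radii and 1.2 m/s velocity are nominal figures for the format, assumed here rather than taken from the post):

```python
import math

def playing_time_min(pitch_m, r_in=0.025, r_out=0.058, velocity=1.2):
    """Approximate playing time from track pitch: the spiral's length is
    roughly the program-area annulus divided by the pitch, and length over
    the constant linear velocity gives seconds."""
    spiral_length = math.pi * (r_out ** 2 - r_in ** 2) / pitch_m
    return spiral_length / velocity / 60

print(playing_time_min(1.6e-6))    # about 75 minutes at the standard pitch
print(playing_time_min(1.48e-6))   # about 81 minutes at a finer pitch
```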

But it’s not even that simple - check out (for a start) http://www.mscience.com/faq57.html
Reply #1
« on: April 09, 2009, 07:49:43 PM »
Seanbaker asked "To insure a good glass master in the worst situation... I'd be safe using 74 minute CD-Rs and write in DAO mode? Anything else?"

SteveG replied, in part,

If you use the right CD-writing software, then those are probably the most important items, yes. But you should be aware of one or two other problems. Firstly, it's important to make sure that the post-data gap is correct. This has to be 2 secs (150 sectors) long, with null data in it, and all '1's in the subcode 'P' channel, although most software will do this automatically.
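Those post-gap requirements can be sketched directly. A toy model in Python (the 2352-byte sector size is a standard figure; representing the P channel as one flag per sector is a simplification of the real interleaved subcode, and `make_postgap` is a hypothetical helper):

```python
SECTOR_BYTES = 2352       # one audio sector = 1/75 of a second
POSTGAP_SECTORS = 150     # 2 seconds, as stated above

def make_postgap():
    """Build the post-data gap: 150 sectors of null audio, with the
    subcode 'P' channel flagged '1' for every sector."""
    audio = b"\x00" * (SECTOR_BYTES * POSTGAP_SECTORS)
    p_channel = [1] * POSTGAP_SECTORS
    return audio, p_channel
```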

The thing that can stuff you, though, is checking your disk for errors on a CD-ROM that handles CIRC by automatically correcting the 'E32-unreadable' errors. It is very easy to miss things this way which will make it through to the glass master and be reproduced on all of your production CDs, so it's always worth checking all of your stuff on a crappy, old, audio CD player before you send it off to be mass-produced.

There are all sorts of other requirements as well, but these are common to all written CDs, so they won't be an issue. It's worth remembering to check things like the SCMS status, though, if you don't want mass digital duplication of your stuff to be immediately possible. Personally, I think that this is so easily overcome anyway that it's hardly worth having.

Most mastering houses have fixed their LBRs (Laser Beam Recorders), that are actually used to make the glass master, so that they won't abort for every little thing. But this usually involves an extensive checking process, and it's always better to have avoided the mistakes in the first place.
Reply #2
« on: April 09, 2009, 07:50:09 PM »
continuing the extract from SteveG's post of 15th June 2001:

First, all the data is multiplexed into one stream by alternating all the 16-bit data words. By definition, the first word is a ‘left channel’ word - the rest follows. Then the data is interleaved by converting the serial stream into eight parallel bit streams, feeding each one through its own different delay, and re-serialising.

You really need pictures for an easy explanation of how it's actually done. What is more useful is knowing why it's done. It's all to do with not putting all of your eggs in one basket. If your audio is broken up into little bits and spread around on the disk, then damage, which usually occurs in only one place, stands a good chance of being repairable by the cross-interleaved Reed-Solomon error correcting mechanisms. Using this 'temporal spreading', which becomes 'spatial spreading' on the disk, in conjunction with the error correction means that quite large holes in the data can be patched around in the replay system.

Now an extra 8-bit word, the sub-code data, is inserted before each block of audio. The control words enable the data to be ‘rebuilt’ in the player. The data field can later be rebuilt 98 control words at a time. The control word is where things like SCMS data are stored, but that’s by the way. Next comes a process called 8 to 14 modulation. This is where each 16-bit data word is split into two 8-bit words, and each one is changed into a 14-bit ‘symbol’. The term ‘symbol’ is used to differentiate between 16-bit and 14-bit words. A ‘symbol’ is really a 14-bit word, but that’s confusing… Lastly, 3 ‘coupling’ bits are written between each symbol. Then the error correction code is added. The last thing written is a unique 24-bit sync word which identifies the start point for the frame.

So, a data frame has a sync word, a control word, 12 symbols of data, 4 symbols of error correction, 12 more symbols of data and four more symbols of error correction. That takes up 588 bits, as there are 3-bit gaps between each section.
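That 588-bit total can be tallied up directly, assuming one 3-bit gap after the sync word and after each of the 33 symbols:

```python
SYNC_BITS = 24      # the unique sync word
MERGE_BITS = 3      # the 3 'coupling' bits after each section
SYMBOL_BITS = 14    # one EFM symbol

# One frame: a control word, then 12 data + 4 parity + 12 data + 4 parity symbols.
symbols = 1 + 12 + 4 + 12 + 4                                   # 33 symbols
frame_bits = SYNC_BITS + MERGE_BITS + symbols * (SYMBOL_BITS + MERGE_BITS)

print(frame_bits)   # 588, matching the figure in the post
```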

All of this is done because of the problems of writing pit data onto a CD. Pits come in 9 different sizes, varying from 0.833 micrometres to 3.56 micrometres long, and 0.5 micrometres wide - these are the outside tolerances. On a normal CD the track spacing is nominally 1.6 micrometres.

So there are these 9 different pit sizes written onto the disk, and obviously the pit rate will relate to the 4.3218 MHz clock for the data rate. Since the clock cannot be directly recorded onto the CD, it has to be extracted from elsewhere to reform the data. The pit lengths were actually chosen so that when the CD is spun at a constant linear velocity, the combined effect of reading them continuously will be to generate sub-harmonics of the recording clock rate, which can then be reconstituted into the 'proper' clock by a Phase-Locked-Loop (PLL) system.
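The nine pit sizes line up with the nine EFM run lengths (3 to 11 channel bits), and the dimensions quoted above fall straight out of the 4.3218 MHz clock and the constant linear velocity. A check in Python (the 3T-11T run lengths and the 1.2-1.4 m/s CLV range are standard figures, assumed here rather than taken from the post):

```python
CHANNEL_CLOCK_HZ = 4.3218e6   # the clock rate quoted above

def pit_length_um(run_bits, velocity_m_s):
    """Physical length of a pit spanning `run_bits` channel-clock periods
    at a given constant linear velocity, in micrometres."""
    return run_bits * velocity_m_s / CHANNEL_CLOCK_HZ * 1e6

# EFM run lengths span 3 to 11 channel bits: nine sizes, matching the post.
print(pit_length_um(3, 1.2))    # shortest pit at the slow end of CLV, ~0.833
print(pit_length_um(11, 1.4))   # longest pit at the fast end, ~3.56
```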

It's actually a very clever system, because it can also take account of small speed variations, and average out the spindle drive speed to provide an overall correct replay rate.

But the conclusions are as follows: Most of the guff spoken by 'Audiophiles' about audio on discs is based pretty much on total ignorance. You can either reconstruct this totally mangled mess, or you can’t. But it was done this way because, believe it or not, you pretty much can. After the re-clocking and reprocessing the data back into a stream, the D/A converters and what comes immediately afterwards are the only things that will significantly affect the audio quality. The reconstructed clock is crystal-controlled, which might just affect the output timing slightly, I suppose. The error correction has three main checking methods, parity, interleave error checking, and linear interpolation. If there is agreement, you will get the output (this is a bit of an over-simplification, but it will do for now).
Reply #3
« on: April 09, 2009, 07:50:28 PM »
In answer to the question "Does the red book standard say anything about the track pitch? Are 80-minute discs not red-book compliant regardless of how much is recorded on them?", posted by Younglove, SteveG replied (in part),

The Red book actually mentions two track pitch standards, one for 63min disks and one for 74min ones. There is some conjecture about how 74mins was actually arrived at - the best story is about some Beethoven Symphony or whatever...

The Orange book standard, now revised, says that you can legally store 700 MBytes of data on a disk. Not only that, but the ATIP at the beginning of the disk, which is precoded with the spiral length, will confirm this to your writer. Since Orange book disks can legally contain audio, this has been interpreted by many as meaning that you can legitimately store 80 mins of audio on it, and I can't actually prove that that's wrong. But you have no right to expect this to play on an audio-only CD player, for two reasons:

Firstly, the reflectivity of your CD-R may not be high enough. Secondly, the pitch dimensions of your disk do not conform to either of the Red book standards! So on a strict interpretation, NO recorder with an 80 minute blank in it can ever make a Red book CD that fully conforms to the standard. You should use 74min blanks, and pray about the reflectivity.
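The "700 MBytes / 80 minutes" pairing is just sector arithmetic. In Python (the 75 sectors/s rate and the per-sector byte counts are standard format figures, not from the post):

```python
SECTORS_PER_SECOND = 75
MODE1_BYTES = 2048        # user data per CD-ROM Mode 1 sector
AUDIO_BYTES = 2352        # bytes per Red book audio sector

sectors_80min = 80 * 60 * SECTORS_PER_SECOND       # 360,000 sectors
data_capacity = sectors_80min * MODE1_BYTES        # bytes of CD-ROM data
audio_capacity = sectors_80min * AUDIO_BYTES       # bytes of raw audio

print(data_capacity)   # 737280000 - the '700 MBytes' (703 MiB) figure
```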

You may have noticed that most fast CD-ROM drives appear to read audio in a 'burst' mode. They actually read a lot of data into a buffer, then read it out slowly, and only increment the drive when they need to. This is why, if you can actually read all of the data off a disk, the transport isn't as critical as the audiophiles would have you believe.
Reply #4
« on: April 09, 2009, 07:51:28 PM »
Another extract from a post by SteveG on 16 June 2001:

Are all CD rippers equal? Do they all work as well with different drives?

One of the problems with the original Red book specification is that nowhere does it specify that addressing has to be block-accurate. So what happens with most CD drives is that the data from the CD is fed into a FIFO buffer, but the data’s block address is handled separately - it’s pulled out of the sub-code handler part, and shoved off to a different part of the drive’s circuitry at the same time that the block is read.

This is all well and fine for one block, but the problem comes when the drive has to get the next block. If the drive has stopped in order to wait for the Hard Disk to store the extracted data, the chances of it hitting exactly the right block when it starts again for the next read are diminished somewhat. It’s usually a pretty small error, but it can, in bad cases, sound like tiny stutters on the extracted audio, or omitted bits. The best fixes for this are actually in software, but it’s worth noting that most Plextor drives have actually addressed the hardware problem, and tied the block addresses to the relevant audio. The software fix works by deliberately over-reading and overlapping the audio, and sliding it all around afterwards until it’s in the correct place, then writing the file.
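The over-read-and-slide fix can be sketched in a few lines. A toy model in Python (`align_overlap` is a hypothetical helper, not taken from any real ripper; items stand in for PCM sample values, and the 32-sample probe and search window are arbitrary illustrative choices):

```python
def align_overlap(prev_tail, new_read, search=588):
    """Sketch of the software fix described above: each read deliberately
    overlaps the previous one, and we slide the new block until it lines up
    with audio we already trust, then drop the duplicated overlap."""
    probe = prev_tail[-32:]                  # last 32 samples we already have
    for offset in range(search):
        if new_read[offset:offset + len(probe)] == probe:
            return new_read[offset + len(probe):]   # duplicated part removed
    return new_read                          # no match found: keep the read
```

A real ripper also has to cope with the probe not matching at all (a re-read), but the sliding match above is the core of the technique.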

This isn’t actually a problem with data, because the CD-ROM spec actually has a 12-byte sync word, and it’s possible to extract the address from the FIFO with the data in. But the Red book spec was never intended to allow this to happen, so the data structure was designed to be fine for streamed audio, but not block extraction, which is what a fast CD-ROM drive wants to do. It’s usually safe to assume that if your fast CD-ROM drive will play audio correctly through your speakers, and supports ripping, then it will be alright. But you should be aware that there are some rogues around - most of them being older drives. I have a drive that makes a complete pig’s breakfast of the process with simple rippers - the drive has appalling characteristics anyway.

It’s the fact that audio blocks were never intended to be data blocks that makes ripping a slow process on some drives - now I hope you understand why.

If you stretch out the spiral on a CD to its full length, it’s over 4 miles long…

Incidentally, I suppose that I ought to add the following to the eight-to-fourteen modulation (EFM) bit I did earlier, so that people can get really confused:

Why does it need to be EFM? In order for the clock sub-harmonic signal to be reliably extracted from the pit data, and keep the PLL locked, the pit patterns have to be arranged so that they meet the following criteria: 1) no two ‘1’s are consecutive, 2) a minimum of 2 ‘0’s exist between two ‘1’s, and 3) a maximum of ten ‘0’s exist between two ‘1’s. In a 14-bit code, only 267 combinations satisfy these requirements, and 256 of those are in a look-up table, so that each 8-bit combination (256 values) can be paired with a 14-bit code, and that’s what’s recorded on the CD. When the 14-bit code is replayed, the correct 8-bit sequence is looked up in the CD player’s firmware and substituted for it. The two 8-bit sequences are then strung back together into a 16-bit sequence. So your data’s probably even more mangled than you thought…
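The 267 figure can be verified by brute force. A check in Python (one interpretive assumption, flagged in the comment: the ten-zero maximum is applied to every zero run, including those at the word boundaries, which is the reading that reproduces the count):

```python
def efm_valid(word):
    """Apply the run-length rules above to one 14-bit pattern. Assumption:
    the maximum-run rule covers every zero run, including leading and
    trailing ones; the minimum-run rule applies between 1s (and so also
    forbids adjacent 1s)."""
    bits = format(word, '014b')
    if '1' not in bits:
        return False                     # fourteen 0s breaks the ten-0 limit
    runs = [len(r) for r in bits.split('1')]
    internal = runs[1:-1]                # zero runs strictly between 1s
    return all(r >= 2 for r in internal) and all(r <= 10 for r in runs)

count = sum(efm_valid(w) for w in range(2 ** 14))
print(count)   # 267 - just enough for 256 byte values, with a few spares
```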
Reply #5
« on: April 09, 2009, 07:51:57 PM »
From a post by SteveG on 16th June 2001:

Did you know that every time you put a new CD-R blank in your writer, the first thing it does is write on the disk?

Now I know that sounds odd, but it actually isn’t at all. We already know that not all blanks are equal - in fact they all vary quite a bit, and so do different writers. So in order to set the right amount of power for the laser to use to change the dye states to the levels specified in the Orange book standard, the drive carries out a process called Optimum Power Calibration (OPC). In the Lead-in on the disk there is an area where the ATIP (Absolute Time In Pregroove) length information and a write power suggestion, the RORP (Recommended Optimum Recorded Power), are written, so the drive can gauge the reflectivity of the disk, and then have a stab at guessing how much power will be needed to write to the disk without splatting the dye so much that it runs into the next turn on the spiral, or writing totally inadequate pits.

It tests its guess in a reserved area before the lead-in called the power calibration area by writing a series of test pits, and then measuring the land area between them. When it finds the optimum one, it looks back at the power setting for that area, and that’s what it uses.

Of course, the power requirement varies with different writing speeds, and the power check has to take account of this. But on any drive that carries out this test competently, it should mean that recording should be optimised for all available speeds. Of course, some blanks are more consistently coated than others, and this can lead to problems. Some drive manufacturers now claim to be able to carry out this whole process during data recording, and adapt the laser power accordingly. Maybe this is why some drives perform better with cheap blanks than others?

There was, on early recorders, the option to just use the RORP value encoded in the ATIP and do no check, but this practice was abandoned ages ago because the results were totally unsatisfactory. All blanks are different, and there is a surprisingly wide tolerance in the laser wavelength allowed, which can have a significant impact on the laser power required. So the RORP-only writer is a bit of an anachronism, and if you have one, it’s definitely time to move on…

So, at normal x1 speed, the laser might test between 4.1 and 7.7 mW for a RORP value of 5.9mW, and use correspondingly higher powers for greater write speeds.
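For what it's worth, the test range in that example brackets the RORP symmetrically. A quick check (reading the 4.1-7.7 mW span as roughly RORP plus or minus 30% is my inference from the numbers given, not something the post states):

```python
RORP_MW = 5.9   # write power suggested in the ATIP, per the example above

# Inference (assumption): the quoted 4.1-7.7 mW test range is RORP +/- ~30%.
low = round(RORP_MW * 0.7, 1)
high = round(RORP_MW * 1.3, 1)

print(low, high)   # 4.1 7.7
```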