
This story was printed from CdrInfo.com, located at http://www.cdrinfo.com.


Appeared on: Thursday, January 08, 2004
Media Quality Tests


1. Quick Introduction

A Guide to the Media Quality Tests at CdrInfo.com

Beta Version


Welcome to the new section of CdrInfo.com devoted to online comparisons of the quality of various recordable media (CD and DVD) according to the drive on which they were recorded (and several other factors). We are proud to present to our readership another innovative section, offering features unique on the entire net. We hope it proves helpful to all of you.

In the following picture you see the one-screen display of the user interface for viewing the media quality tests. This is a single page where the user can locate a particular test according to various selection criteria he defines.

Click for large view

At this time the reader can access this page here. The user is first asked to log in with his e-mail address and a password. If he is not a member of our site (this is different from being a member of our forum), he can become a member by clicking here.

Once the new member has joined our site, he can go to his account here and provide the appropriate information. He must then confirm his participation in the "Media Quality Tests" area by clicking the confirmation box here.

He will then be able to view the complete suite of already available tests.

Each reader can select up to 5 different comparison cases and immediately see the on-screen differences between the tested media. Each curve at the bottom right represents the number of reading errors over time. The lower the numbers in each displayed curve, the better the results.

So, everyone can now easily find out (among many other things) which media brand is best suited to his own drive. Each of these (at most) 5 selections depends on several factors. These are listed from top to bottom at the left of the above picture, and in greater detail below.

In this picture you see listed, in order of necessary selection (top-to-bottom), each list-box and its associated definition. You can read detailed instructions on the functionality of each of these controls (list-boxes) here. <link to "View Page description" headline>

In order to view these tests correctly, you will have to adjust your browser's security settings appropriately. Quick guidance is provided here.

The technical details of the theoretical basis these tests rest on can be viewed here. <link to "Measuring the quality of recorded media" headline>

An explanation of the programming approach we chose, for the more technically oriented reader, can be found here. <link to "Programming decisions for developing the Media Quality Tests" headline>

2. View Page Description


Please pay attention to the picture below.

The following are the steps you take when selecting each test result for viewing it or comparing it with other results.

1. First the user has to decide whether to view tests concerning CDs or DVDs. By clicking on the relevant toolbar at the top he can change this option. The default is CD tests:

2. Next he will have to decide what type of tests he is interested in viewing. Depending on the choice of CD or DVD in step 1 above, he is offered a different list of options.

We note that decisions 1 and 2 come first. Changing them later will reset everything else the user has selected so far. So please be careful with your choices here.

The next steps, 3-13, must be taken in exactly this order when making the first selection for a test. Subsequently the user can fix some of his choices in order to add similar tests faster, provided there are tests available in our database.

Please wait a few seconds while the next list-box is populated with the data relevant to your previous selection. A round trip to our server is required. Over a DSL or faster connection this delay will be negligible, provided the net out there is under normal traffic. Over ISDN or PSTN much depends on the ping time from your particular location to our site. In general, ping times under 250-300 ms will give a very acceptable postback response and a pleasant user experience.

3. Please now select the manufacturer of the drive for which you want to find relevant tests. This is the maker of the recording drive.

You have to select this here in order to keep the number of options in the other list-boxes below to an acceptable length. In the case of an OEM drive sold under a different brand, please choose the latter. Batches of drives, even from the same OEM, usually differ in quality, either towards the top or the bottom, depending on the actual brand name that appears on the box of the drive you bought and its agreement with the OEM.

4. Next select the drive model.

5. Then the firmware of the drive.

Fixing steps 3, 4 and 5, by checking the "Fix it" box to the right of each selection, gives you the option of reducing the number of choices you have to make for each drive/media combination. Once, for example, you have selected the first test, you can fix all the selected options up to the firmware version if you like, provided there are other tests in our database matching your fixed choices:

6. In this step, similarly, choose the Disc manufacturer. This is the name of the brand as it appears on the disc or its covers. If there is no such info in either place, then please use the name of the disc manufacturer as reported by the media identification application (DvdInfo, CdSpeed, ...).

7. Then select the Disc model. This is displayed on the disc itself, on the covers of the disc, or in both places, provided you did not buy the disc in bulk without any identifying mark. In the latter case choose one of the available _Unbranded options.

8. Here choose the Disc Identity, composed of the "Manufacturer" and "msf" readings displayed by the Disc Identification Utility. You can see how other testers have used this selection, in order not to post the same disc twice.

9. Next, choose the Testing application. For the time being, and for reasons explained here <link>, there are only 3 applications available: KProbe, UmDoctor and CdrInfo. The latter is used only by our team for publishing our own media quality tests based on professional Yamaha equipment. The former 2 programs are acceptable for any tester to use and submit tests with.

10. Afterwards, please choose the reader drive. The available options depend on the choices you made previously, as the reader must be compatible with the test application (step 9) and at least one test must be available for viewing.

11. In this step you choose the recording speed. This lets you find out how different speeds chosen during recording affect the quality of the measured results.

12. Similarly, you must choose the testing speed. In some cases this is not adjustable during testing and is thus equal to the maximum; in other cases clever use of speed-reduction software might adjust it accordingly, and hence offer distinguishable test results.

13. Next select the particular test you want to see.

In this respect our page is very "democratic" :) Each of our site members can submit his own test, but only once, although he can subsequently edit it or replace it with a newer or (presumably) more accurate measurement. (This applies to the same combination of drive, media and all other settings.)

On top of these options a user can also choose each of the following cases: "Best", "Worst" and "Average". The first refers to the test with the fewest errors, the second to the test with the most errors. The "Average" case is the submitted test closest to the mathematical average of all submitted tests. For more information, please read the answer to the relevant FAQ question.
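To make the three aggregates concrete, here is a minimal sketch of how "Best", "Worst" and "Average" could be picked from a set of submitted tests. This is purely illustrative Python, not the site's actual code, and it assumes each test can be reduced to a single total error count:

```python
# Hypothetical sketch: choosing the "Best", "Worst" and "Average"
# aggregates from submitted tests. Field layout (submitter, total
# error count) is an assumption for illustration only.

def pick_aggregates(tests):
    """tests: non-empty list of (submitter, total_error_count) pairs."""
    best = min(tests, key=lambda t: t[1])    # fewest errors
    worst = max(tests, key=lambda t: t[1])   # most errors
    mean = sum(t[1] for t in tests) / len(tests)
    # "Average" is the real submitted test closest to the mathematical mean
    average = min(tests, key=lambda t: abs(t[1] - mean))
    return best, worst, average

tests = [("alice", 120), ("bob", 45), ("carol", 300)]
best, worst, average = pick_aggregates(tests)
```

Note that "Average" is always an actual submitted test, never a synthetic curve.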

14. As soon as you have made your choices above, you are ready to formalize your selection by clicking the "Add" button at the top of the right part of the page.

Click for large view

You can then either press "View" and immediately see the chosen graphic of a test, or continue your selections on the left, adding up to 5 different disc/drive/etc. combinations.

Click for large image

You cannot add the same test twice, unless it has been submitted by a different user (member) or refers to the aggregates "Best", "Worst" or "Average".

You can thus repeat steps 3-13 above to make more selections.

You can also remove some of them at any time. The system is clever enough to keep the state of the page intact, as long as you do not switch to a different media or test type (steps 1, 2).

Click for large image

In that case you must start over from the beginning. This can be tedious if you have already made several selections, so please pay particular attention to this.


3. Submission of Tests

Helping you understand how the submission of tests works

Each user can also submit his own tests. He will have to become a CdrInfo.com member first, however. This is easy: all that is required is a valid e-mail address and following the steps he is given when going here, as explained in the previous section.

Below is a picture of the main page you see when logged on as "Tester" in the "Media Quality Tests" section of our site.

You must click the "submit" link at the top left of the page, and you are immediately redirected to the page where you can submit your first test result, as shown below.

 

Click for large view

The structure of this page (Submit Page) is similar in concept to the initial page for viewing the tests themselves (View Page). You must have a fairly good understanding of how this section works before you can successfully submit any tests. So, we suggest you spend some time acquainting yourself with how to choose and view the already available tests before going on to submit your own test results. We further urge you to read this whole article carefully and understand the various whats and whys well in advance.

The toolbar on the left of this page also offers the option of seeing all tests submitted by you. You can also edit them! Here is the relevant picture of this "Edit Page".

 

For details and an explanation of the steps you will have to follow for submitting a test, and the rationale behind them, click here. <link to: “Submit Test Page description” headline>

For details on the screen containing the full list of tests conducted by you, please go here. <link to “Page of List tests conducted by a particular reader” headline>


4. Submit Tests: detailed description


Please first read the previous section and acquaint yourself with the "View Page" of the tests. Having noticed some of its finer details will most probably enable you to submit your own tests with less effort and fewer errors.

First please review the security settings of your browser as explained here.

We assume that you have some knowledge of the test reading applications, like KProbe and UmDoctor, and how they behave. More details on each candidate reading application can be found in the appendices of this review.

Below is a complete picture of the page for submitting your own tests.

Click for large image

We should say a few things about the top and left toolbars above. On the top toolbar the user can change what he can edit by switching between Member and Tester. As a Member he can define and edit the various options concerning his account.

 

As a media Tester he is able to view the submitted tests, post new tests and edit previously posted tests.

In each case he can choose, from the toolbar on the left, which of the available pages to access. At any time he can also locate his position in the web site by reading the labels of the "yellow" selections.

Next, we identify in the following picture the steps you will have to take in order to submit your first successful test.

First you will have to choose the parameters of the test as in steps 1-12 below, then you will have to transform and submit the actual result file. Please read on.

Choosing the Parameters

Most of the following steps 1-12 can be taken in any order, with a few notable exceptions (steps 1, 2 and 9, 10 below). In these 2 cases (4 steps) step 1 must come before step 2, and step 9 before step 10. Either way, the user can follow the steps exactly as presented below and be sure he has made a valid initial selection.

In order to submit a new test, first you will have to tell the system the parameters of the test you have done.

Steps

1. Please choose the media type.

2. According to your previous selection in step 1, you will be presented with a drop-down list of available Types of tests. Choose the one suitable for your case. Please note that step 1 has to be taken first. Otherwise, that is, if you first choose the Type of the test, changing the media type will reset the test Type selection.

This behavior is by design. Each disc type has a different list of possible tests. In general, most of the time you will deal with one of the following 2 cases: CD-R/C1-C2 and DVD-R/PI-PO.

3. In this step select the Manufacturer of the recording drive.

4. Choose the recording drive model.

5. Choose the firmware of the recorder.

In steps 3-5 you should take into account the same issues for the respective steps in the case of the "View Page".

6. In this step select the Manufacturer of the recording disc.

7. Choose the disc model.

8. Choose the Identifier of the disc.

As before, please consult the relevant steps in the "View Page" description.

9. In this step choose one of the following 3 non-grayed-out radio buttons. A normal user will have to further restrict his available options to 2: the first is KProbe, the other UmDoctor. We explain in a following section why we are restricted to these 2 selections for the time being.

10. According to your previous selection, the right drop-down list is populated with the group of available testing drives. This differs because each of the 2 above programs works only with a particular manufacturer's chipset.

11. In this step choose the recording speed of the disc whose test you are about to post.

12. Similarly, choose the testing speed.

Having completed the above steps, you are ready to check whether this test has already been submitted by you. We have added this precautionary step to help you eliminate possible failure cases.

13. Please press the "Check test Availability" button and you will be presented with one of the following 2 pop-up windows.

Pop-Up 1
Pop-Up 2

In the first case you are free to proceed to actually posting the results; in the other case you will have to check for missing parameters, or change at least one of them, in order to define a test that has not already been posted.

Transforming and posting the file

You can do this in 5 easy steps. (These are highlighted from left-to-right in the following picture.)

You should be familiar with the security-relaxation requirements as specifically explained in the relevant section. If you fail to follow these guidelines, you will be unable to submit any tests.

1. Press the "Browse" button in order to locate, in your local PC filesystem, the .CSV file containing the results of the test whose parameters you have chosen above.

2. Press "Load original csv file" in order to load the file into memory.

3. Press "Shrink Csv File" in order to shrink the file to about 1 KB. This is necessary to be able to include the test results in our database.

4. Press the "Check Csv validity" button. This may seem an almost redundant step. We have included it for those willing to manually populate the right panel with the test results (not recommended). We have included code for checking most errors, but it is still not fail-safe. You must follow the whole procedure with an original Csv file in order to be sure you are submitting the correct result.

Please note that a KProbe or UmDoctor bug might result in illegitimate Csv files. In that case please contact us.

5. Last, press the "Submit Test" button in order to post the test.
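As a sketch of what the shrink and validity steps above amount to: our actual client-side code is JavaScript, but the idea can be illustrated in Python. The column layout follows the Csv format section of this article (time in seconds, C1/PI errors, C2/PO errors); the ~1 KB target and the every-n-th-row sampling strategy are assumptions, not the site's real algorithm:

```python
# Illustrative sketch of the "Shrink Csv File" and "Check Csv validity"
# steps. Sampling strategy and target size are assumptions.

def shrink_csv(lines, target_bytes=1024):
    """Keep every n-th row so the result fits in roughly target_bytes."""
    if not lines:
        return []
    avg_len = sum(len(l) + 1 for l in lines) / len(lines)  # +1 for newline
    step = max(1, int(len(lines) * avg_len // target_bytes))
    return lines[::step]

def csv_is_valid(lines):
    """Each line must hold exactly 3 comma-separated numeric fields."""
    for line in lines:
        fields = line.split(",")
        if len(fields) != 3:
            return False
        try:
            [float(f) for f in fields]
        except ValueError:
            return False
    return True

# e.g. a 5000-row measurement file shrunk to roughly 1 KB
rows = [f"{t},{t % 7},{t % 3}" for t in range(5000)]
small = shrink_csv(rows)
```

Shrinking before upload is what makes a multi-megabyte PI/PO scan small enough to store in the database.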


5. List of Tests conducted by a particular reader

The List of Tests conducted by a particular reader

As a member of our site you can not only submit new tests, but just as easily make corrections to those tests already submitted by you. You can find out which tests you have already submitted by clicking on the relevant link in the toolbar at the left of the pages visible once you log on as Tester. Please see the picture below or this.

To go to this page just click the button "Submitted Tests" on the left.

Once you have done so, you are presented with a list of the tests you have done so far. The list includes both those tests already accepted for public viewing and those awaiting acceptance by our check team.

Click for large image

The list is completely navigable and sortable. Just click on the header(s) of the table, on whatever field(s) you like, and the list is immediately sorted in the requested order. Click twice for reverse ordering, just as in a typical Windows application. You can also order your tests by sorting on several fields, each in ascending or descending order, by clicking a number of headers repeatedly in the respective order!

Below is a picture of the list (table) sorted in ascending order with respect to the drive manufacturer.
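The multi-field ordering just described can be sketched as follows. The row fields here are hypothetical (our page is not written in Python); the point is simply that applying a stable sort from the least to the most significant key yields the combined ordering:

```python
# Sketch of multi-field table sorting, assuming each test row is a dict.
# keys: list of (field_name, ascending) in click/priority order.

def multi_sort(rows, keys):
    """Python's sorted() is stable, so sorting by the least significant
    key first and the most significant key last gives the combined order."""
    for field, ascending in reversed(keys):
        rows = sorted(rows, key=lambda r: r[field], reverse=not ascending)
    return rows

rows = [
    {"maker": "Aopen", "speed": 40},
    {"maker": "Plextor", "speed": 16},
    {"maker": "Aopen", "speed": 16},
]
# maker ascending, then speed descending within each maker
ordered = multi_sort(rows, [("maker", True), ("speed", False)])
```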

At the top of the table there is the "filter section". Please see the picture below.

By setting the list and radio boxes appropriately, you can filter down to only those tests you are interested in. This is particularly convenient if you have submitted a lot of tests, or if you are among the members of our team for "checking in" (that is, accepting) member-submitted tests.

Please take into account that the initial state of the Filter is to allow all tests. The same effect is achieved by refreshing the page or by typing the page URL again and pressing Enter.

Please note that when more than a few dozen tests exist, you can easily navigate by clicking the relevant page numbers at the bottom of the list (table).

We have also added the feature of quickly previewing a particular test from within this page itself.


6. Empty


To be added during revision.

 


7. An example of viewing a test


We provide below a simple example of how to see a particular test. We start by showing you how to become a Member of our community. Next we explain how to become a Tester and be able to both view and submit your own tests.

In this section we will show you how to view tests. The next section is devoted to how to submit a test.

From the CdrInfo website select the "Media Tests" option.

You will be taken to the appropriate section, where you will be able to become a member. Press the link as illustrated below.

This is what is pointed to by the arrow in the following photo.

The link will lead you to the following screen where you should give an email address, password and a name.

As soon as you press the "Submit" button you will see the screen below. At the same time you will receive an email message. It is important to read this message in order to continue the registration procedure.

From your email account you will be able to read the following message, which provides information on how to proceed.

After pressing the link in your email message you will be transferred here:

Press Submit. The following message informs you that your verification has been completed.

From the Members login page I can now easily login using the name and password I gave during the verification process. As soon as I login I will see the following screen.

Please pay attention to the picture below.

The following are the steps I have to take during the selection of a particular test result for viewing it or comparing it with other results.

My case concerns an Aopen drive. So...

1. First I set the media type to CD-R. By clicking on the relevant toolbar at the top I can change this option. The default is CD tests:

2. Next I have to define what type of tests I want to see. I choose C1/C2 test type.

Please wait a few seconds while the next list-box is populated with the data relevant to the previous selection.

3. Now I select the manufacturer which in our example is Aopen.

4. Then the drive's model, CRW5232.

5. Then the firmware of the drive, v1.03.

6. In this step, similarly, I choose the Disc manufacturer (this is the name of the brand as it appears on the disc or its covers). If there is no such info in either place, use the name of the disc manufacturer as reported by the media identification application (DvdInfo, CdSpeed, ...). To continue this example I set it to Taiyo Yuden.

7. Next I set the Disc model. This is displayed on the disc itself, on the covers of the disc, or in both places, provided you did not buy the disc in bulk without any identifying mark. That's why in our case I select CD-R Unbranded Silver.

8. Right after, I choose the Disc Identity, composed of the "Manufacturer" and "msf" readings displayed by the Disc Identification Utility. In our case, Taiyo Yuden 97m24s01f.

9. Next, I choose the Testing application. For the time being, and for reasons explained HERE, there are only 3 applications available: KProbe, UmDoctor and CdrInfo. I choose the UmDoctor option. If you have questions about how to use UmDoctor press HERE.

10. Afterwards, I choose the reader drive. The available options depend on the previous choices, as the reader must be compatible with the test application (step 9) and at least one test must be available for viewing. UmDoctor is compatible only with DVD recorders with a Sanyo chipset that supports HD-Burn. Those are the Optorite DD0201, DD0203, DD0401 and DD0405. So I choose the Optorite DD0203 drive.

11. Then I choose the recording speed, 40x.

12. Similarly, I choose the testing speed, again 40x.

13. Next I select the particular test I want to see. You can choose between the Best, Worst and Average results or an available user. In our case there is only one user, me, so I select "Tony Veglis".

14. As soon as I have made my choices above, I am ready to formalize my selection by clicking the "Add" button at the top of the right part of the page.

Click for large view

I press "View" and immediately see the chosen graphic of a test, or I can continue my selections on the left, adding up to 5 different disc/drive/etc. combinations.

Click for large image

I cannot add the same test twice, unless it has been submitted by a different user (member) or refers to the aggregates "Best", "Worst" or "Average".

You can thus repeat steps 3-13 above for making more selections.

I can also remove some of them at any time. The system is clever enough to keep the state of the page intact, as long as I do not switch to a different media or test type (steps 1, 2).

Click for large image

In that case you must start over from the beginning. This can be tedious if you have already made several selections, so please pay particular attention to this.

If you would like to become a tester, the procedure is very simple. From your home page click on the "joining here" link, indicated by the arrow in the picture below.

You will be transferred to a page like the one below, where all you have to do is press the "Activate Membership" button.

Finally, you will be informed that the joining procedure is complete and you have joined Media Tests as a Tester. That's all!


8. An example of submitting a test


We assume you have read the previous section on how to become a Member and Tester of our community and on how to view a particular test.

In this section we will show you how to actually submit a test. We suggest you also read all the other sections of our review, in order to grasp all aspects of the required procedures and have all your questions answered.

Once you have logged on to the CdrInfo.com Member home-page, one of the available options you have is to submit Media Quality Tests.

To do this, choose the Tester role from the MediaTests menu.

When the new page loads you will have been directed to the MediaTests page. Using the navigation menu on the left, you can view the submitted tests, post new tests and edit previously posted tests.

To submit a new test just press the Submit Test button and you will be directed to the submit test page.

For this test I am going to submit the test results of a Taiyo Yuden (TY) CD-R, written with a Plextor PX-W1610A. The Program used for the quality test is the UMDoctor Pro II and the reader is the Optorite DD0401 drive. The test type is C1/C2.

In order to proceed to the submit test page I have already saved the required .dat files from UmDoctor, as shown here.

On the top left of the submit area there is the media type selection area. I choose CD-R for media type because I want to test a CD-R.

Below this area, the test type list-box is populated with options accordingly. I choose C1/C2 as the test type.

In the next column to the right, the Recorder Drive column, I select the specifications of my writer. As soon as I choose Plextor as the manufacturer, the next selection loads and I choose the Model and finally the Firmware of my recorder, as seen below.

Next to the Recorder Drive column, there is the Disc Column. Here I will select the specifications of the written disc.

I select Taiyo Yuden as Manufacturer, CD-R Unbranded Silver for Model because there is no brand on the surface of my disc, and finally the ID of my disc. To find out the ID of my disc I have already used the DVDinfo software available here.

Please notice that in the right corner of both the Recorder Drive and Disc columns there is an Other button, marked on the photo above, that will direct you to the Data Inclusion Request page. You can use this button to request the inclusion of a new drive or a new disc. The fields that must be filled in are shown below. Notice that the form is separated into two categories: the drive request and the disc request.

In the following step I will choose the program that I used to test (scan) the recorded CD-R. UmDoctor Pro II is the program used for this demo.

According to my previous choice, I can only choose an Optorite drive from the listbox shown here.

This is the final step in the selection of the parameters for submitting a test result. On this menu I choose the speed at which the disc was recorded (16x) and the speed at which it was tested (24x).

The next step is very important. By pressing the button I check whether the parameters I have entered so far are correct and whether this particular test has already been submitted by me.

The test availability window pops-up and informs me that I have made a Complete parameter selection and additionally this test has Not been submitted by me.

If the pop-up window looks like this

you will have to check for missing parameters, or change at least one of them in order to define a test not already posted by you. (There is a limit of 1 test per parameter selection and per user.)

Up to this point I have chosen the settings of my test and checked them for test availability. The next thing to do is to locate the .dat file on my PC. I click on the Browse button to load the .dat file from my PC.

This contains the results of the test whose parameters I have defined above. Then I press the "Load Original Csv File" button. When the page loads you will see the following picture.

In the next step there is a "Shrink Csv File" button that will shrink my file to about 1 KB in size. I press the "Shrink Csv File" button and the .dat file loads in a new panel to the right. Notice that the new file is now comma-delimited.

Just like in the previous step, there is a button to test whether the loaded file is valid to submit. I press the Check .csv Validity button and a new window pops up to inform me whether my .Csv file is Valid or Invalid.

The following picture shows that I have made an invalid submission and asks me to correct it.

In the following photo we can see the window that pops up when the submission is valid.

The final step in order to submit the test is to press the submit button.


9. Security constraints


In order to view and submit your own tests properly, you need to have your browser's settings adjusted accordingly. Please read below for each case (view tests / submit tests) respectively.

View Page (view the tests)

For viewing the page with the Tests, there is no user obligation other than using a compatible browser (IE has already been tested thoroughly in this respect) and having JavaScript (also called "Active Scripting") enabled.

In the following the reader can see pictures of how the security settings for the "Internet Zone" should be set.

There is no ActiveX or other plugin required. Everything transmitted is pure html and JavaScript.

Although everything is now based server-side on Asp.Net, our decision here was based only on the resources and people available to help us develop these pages. There is nothing special to .Net here, other than achieving easy user authentication, page state and some structuring of the underlying code. Let's see how this framework withstands users' appetite for bandwidth :)

Submit Page (submit a test)

Much of what we said above applies here as well, with one notable exception: the user is required to use Internet Explorer exclusively as his browser, and to be more tolerant towards security. This is a technical requirement, because the format translation from each measuring application (C1/C2-PI/PO for the time being) has to be done on the submitter's machine.

Since this format transformation requires extensive resources, in both time and memory, it would be impractical to devote all that processing power on the server side.

We must make this clear to our users, so as to minimize criticism of the way we designed the submission page. Of course, we do accept this, along with any other criticism; in fact we encourage it. All this is said here to save some of the time spent clarifying and answering identical or similar questions from our users over and over again.

First, we must stress the fact that some programs, like KProbe in the case of DVD PI/PO measurement, produce text files of over 1 MB in length! Posting them to our servers would be prohibitive anyway. We hope this argument by itself justifies the approach we adopted.

For all these reasons, a user will have to use IE 5.5 or higher to be able to submit tests. (IE 5.0 might suffice, but we have not tested this yet.) He will also have to enable "Active Scripting" and relax some other security settings. For this reason the best approach is to add "cdrinfo.com" to the list of trusted sites. Then he will have to verify that "Initialize and script ActiveX controls not marked as safe for scripting", "Run ActiveX controls and plug-ins" and "Access data sources across domains" are all set to either "Enable" or "Prompt".

In the latter case the user will be prompted for each test submission about allowing or not the relevant action. Please take into account that the code used for this case is pure JavaScript and thus publicly available for viewing and evaluation by experts. So, we are completely certain that we respect our users. At least as they do respect our site themselves, along with our continuous efforts for always bringing them the best news and reviews on the market and the industry today.

The following pictures will help you adjust the security settings for the "Trusted Sites" zone under IE, after adding CdrInfo.com to the list of trusted sites.

First uncheck the requirement for Https, then add the cdrinfo.com site to the list of trusted sites.

Then, please adjust the settings as shown in the following pictures.

The following setting is the same as in the previous case.

The following settings affect only the submission of new user tests.

You might be able to find answers to some other questions on security in the FAQ section of this presentation.


10. Csv file format

Csv file format

This is the format of the test files. They are actually normal text files following a pre-specified pattern of numerals, text and punctuation.

Usually, a Csv file corresponds to a simple table, which in turn may be considered a text form of a typical database table.

A Csv file, consequently, is a series of rows (lines). In our case there are a few constraints this file must follow. Each line carries 3 numbers. The first is the time (in seconds) at which the following 2 measurements were taken. The second is the number of C1 errors (for a tested CD) or PI errors (for a tested DVD). The third is the corresponding number of C2 or PO errors.

In some cases it is useful to include a header in the Csv file, labelling each of the numbers according to the column it belongs to.

In an original Csv file the non-integer part of the time is its normal fractional part; it does not correspond to sectors (1/75 of a second). As a result, the error rates illustrated in the graphs of our database do not correspond to the absolute values measured by the programs themselves, because we sum them up over periods of one minute for CDs and 8 minutes for DVDs, for easy inclusion in our database. However, these results remain completely comparable with each other, and serve the intended purpose of comparing quality results from different media.

Non-integer values for the other two numbers on each line are due to the above-mentioned summation (rounded to 2 decimal places) performed on the original Csv file. Instead of showing several thousand numbers for each test result in a single picture, which would be user-unfriendly and cumbersome for the database, we only include measurements averaged over 1- or 8-minute periods.

Below, on the left, is a sample Csv file for a CD test result. On the right are the numbers that appear after shrinking the original Csv.

Typical CD Csv file

Original .CSV        Shrinked .CSV
                     "DiscTime", "Sum1", "Sum2"
69,6,0               1,35,0
157,1,0              2,43,0
239,0,0              3,202,0
321,0,0              4,25,0
403,0,0              5,35,0
482,1,0              6,36,0
561,0,0              7,11,0
...                  ...
359003,2,0           74,28,0
359097,1,0           75,26,0
359184,1,0           76,53,0
359276,1,0           77,39,0
359363,0,0           78,44,0
359450,0,0           79,30,0
359542,0,0           80,22,0

A sample of DVD test results is shown below; on the right are the topmost lines of the shrinked file. In both the previous and the present case, the measurements are averaged over 1-minute intervals during display. During submission, however, the Csv file is further summed down to one result every 8 minutes.

Typical DVD Csv file

Original .CSV        Shrinked .CSV
                     "DiscTime", "Sum1", "Sum2"
144,1,0              1,54,0
275,0,0              2,48,0
410,3,0              3,67,0
541,1,0              4,61,0
675,3,0              5,117,0
807,2,0              6,153,0
949,2,0              7,156,0
...                  ...
2213576,31,0         487,670,0
2213716,44,0         488,565,1
2213860,22,0         489,582,0
2213999,26,0         490,728,0
2214135,23,0         491,737,0
2214274,36,0         492,1102,1
2214411,25,0         493,84,0
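The "shrinking" (summation) step described above can be sketched in a few lines of Python. This is our own illustration, not CdrInfo.com's actual code: the function name, the in-memory row format and the assumption that time is expressed in seconds are ours; the window is 60 s for the 1-minute CD interval (use 8 * 60 for DVDs).

```python
# Hypothetical sketch of the "shrinking" (summation) step described above.
# Assumes rows of (time, errors1, errors2) with time in seconds; the window
# is 60 s for CDs and 8*60 s for DVDs, per the text. All names are ours.
from collections import defaultdict

def shrink(rows, window_seconds=60):
    """Sum error counts into fixed time windows; return (window, sum1, sum2) rows."""
    sums = defaultdict(lambda: [0.0, 0.0])
    for t, e1, e2 in rows:
        w = int(t // window_seconds) + 1        # 1-based window index
        sums[w][0] += e1
        sums[w][1] += e2
    return [(w, round(s[0], 2), round(s[1], 2)) for w, s in sorted(sums.items())]

rows = [(5, 6, 0), (30, 1, 0), (65, 2, 0), (130, 0, 1)]
print(shrink(rows, 60))   # -> [(1, 7.0, 0.0), (2, 2.0, 0.0), (3, 0.0, 1.0)]
```

Each output row then matches the "Shrinked .CSV" shape above: a window index followed by the two summed error counts, rounded to 2 decimal places.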

11. Some suggestions for the proper submission

Some suggestions for the proper submission of viable Tests

Although many users seem to be aware of the proper handling of tests such as those submitted on our on-line application page (judging by our forum topics), we believe that adding some arguments of our own might help both beginning and more advanced readers with some simple issues that, despite their simplicity, can significantly affect the validity of a test submission.

Before posting, please perform each test at least twice. Use identical testing conditions (the same PC, identical running software, the same degree of CPU usage, etc.) and try to use discs from the same batch.

Please handle discs with care. Dust or fingerprints can easily produce misleading test results. Do not touch the lower surface of a disc, as it is more susceptible to scratches and dust particles that will artificially produce otherwise unjustified errors.

As recorders and readers, always use drives that are known to perform well under regular everyday use.

Use older drives only if they continue to perform flawlessly. Please take into account that a drive about to become defective shows early signs of its condition, by failing to recognize some discs and/or slowing down during parts of the recording/reading process. In general, any abnormal drive behavior should be enough to keep you from using it for submitting tests.

If you do submit such tests (it's always up to you!) you might soon find that they deviate too much from other published tests. In that case your test might seem misleading to some readers. It is then a good idea to withdraw it, or to revise it using newer or more appropriate equipment.

Please bear in mind that by submitting tests you help both yourself and our user community to find quality hardware. The more tests that are available, and the more reliable they are, the more helpful our application can be to you and to all the rest of us.

You have at your disposal the freedom and the power you need for your work to be seen, and we encourage you to attain the best outcome.


12. Measuring the quality of recorded media

Measuring the quality of recorded media

General Considerations

There are many factors that influence the quality of a particular medium recorded on a particular recorder drive, and there are also many ways to identify and qualify these factors.

Different media are based on different chemicals, and differences in the manufacturing process followed during the making and packaging of a disc can largely influence the resulting media quality when testing them after recording. For example, the so-called yellow dyes (phthalocyanine) used by some manufacturers require less laser power for recording and consequently a different recording strategy on the part of the recording drive's firmware for obtaining optimum quality. On the other hand, discs based on the same chemicals but produced by different manufacturers have been found to produce large deviations in quality. (We won't name any of them here, but it is a good idea for the reader to experiment on the quality tests page itself! :)

Even drives from the same manufacturer, of the same model and firmware, might produce different quality measurements. This is not strange in itself; all sections of traditional industry have learned to live with such discrepancies for nearly a hundred years now. The industry has also exercised the use of statistics for dealing with these phenomena: a large number of samples, with "quality" measurements averaged over the number of units produced, offers an acceptable measure of quality in most cases.

There are many indicators of quality when dealing with CDs and DVDs in the labs. These are based on both mechanical and optical measurements of the surface of a recorded or pressed disc. Even unrecorded discs can be measured against specific tests for identifying imperfections in the ATIP or the disc geometry itself. All these measurements are of some importance to industry engineers for locating manufacturing process flaws, or when working towards disc and recorder improvements in research and development groups.

Most of the above measures are "analogue" in nature. They offer some sort of numeric value tracking the changes of a geometric, optical, or otherwise physical property of the medium, the recorder or some other aspect of a process. An example of such a measure is the focus/tracking deviation of the spiral of a disc with respect to the laser diode beam. These measures are of particular importance for drive and production machine calibration and for developing better encoding and recording algorithms and strategies, but in the end they seem to be of lesser importance to end-user requirements. This might sound like a bold statement, but actually it is not. Please read below.

The end purpose of any "quality" measurement, for a particular user, is whether or not a particular disc is playable on his/her player under most circumstances. Consequently, when it comes to pure user understanding of quality, what becomes prominent is not some analogue measure in general, but the number of reading errors the drive encounters per second during playback.

These errors are either of minor importance to overall disc readability, or constitute a major and thus decisive factor of reading quality. In the former case these errors, for compact discs, are defined as C1 errors and are only indicative of the quality of the disc. The latter type of errors are called C2 errors; in the case of audio CDs, when encountered in sufficient consecutive numbers, they will most probably result in audible clicks during playback. When encountered in large quantities over a very small amount of time, they will most probably be heard as glitches, or not heard at all (muted music due to the built-in algorithms of consumer players).

This outcome depends on the type of player and the playback method. For example, consumer players embody better C2 error concealment algorithms than PC drives. PC drives, on the other hand, will bypass the built-in error concealing/smoothing algorithms and will produce audible glitches when playback is done digitally (ripping, or digital extraction to an external D/A decoder/amplifier or an installed sound card)!

As far as data CDs (and PC backup admins) are concerned, there is also a third layer of error correction, applied only to data discs. The errors produced at this level are characterized as L3 (layer-3) errors. Encountering such an error even once per disc usually means that at least one backup file is corrupted.

In the case of our tests here, we will not have to deal with such errors at all. We already consider even isolated cases of C2 errors a major event, and the absence of errors of this type eliminates any possibility of L3 errors.

Implementation details

In the case of our tests, we restrict ourselves to only those types of tests that are easily conducted by the average user (or, at least, by users who are somewhat more technically inclined). This was the first requirement on our design board. Secondly, a number of other factors influenced our decisions and practices: each test should be technically well recognized, widely accepted and easily implementable with products offered in the market at reasonable prices. Since most professional equipment is beyond the budget of the everyday user, we restricted our tests to those offered by some program developers on retail drives.

(We can very easily include professional tests if we decide to do so, and we have already included many tests performed during our regular drive reviews using our own expert equipment. In the future, all tests and reviews of new hardware will also appear in this "Media Quality Tests" section.)

Having expressed the above requirements, we must state that, thus far, we are aware of only 4 such programs: PlexTools Professional, CdSpeed, UMDoctor and KProbe. It is important to point out that not all of these programs work on every available drive. In particular, PlexTools Professional works only with the Plextor Premium (523252), CdSpeed works with drives based on the Plextor Premium chipset and the MediaTek chipsets, UMDoctor requires a Sanyo chipset with an accompanying firmware, and KProbe is restricted to drives based on the MediaTek chipsets and firmwares.

We have thus restricted our prospective reader drives to only a few dozen CD and DVD readers. But we have still left open the possibility of using an arbitrary recorder. This is certainly good news, as any user can find out the best disc for his recorder, although he will not be able to submit any tests himself!


13. Frequently Asked Questions

Frequently Asked Questions

1. Why are some selections in the View Tests page grayed out?

Depending on the grayed-out option, this might happen for several reasons.

In the case of the "Enhanced Recording Mode" (RecModeEnhanced) option, we decided, for the time being, not to offer it publicly, in order to keep the user selections as simple as possible. We might reconsider this in the future if many members request it. The same applies to the "Enhanced Audio Recording Mode" (RecModeAudioEnhanced) option.

In the case of the reading Program, we have (presently) restricted the available options to only 2, KProbe and UmDoctor, because currently these are the only programs offering text output in the form of the so-called csv file. Otherwise, each user would have to use a graphics-to-vector translation application (like the one included with CorelDraw, for example) to produce it. We think there is not much sense in pursuing such an endeavor, so we have presently disabled the inclusion of tests produced by PlexTools and CdSpeed.

2. I have fixed some selections on some list-boxes and there is no way now for me to add additional tests. Can you please fix the page?

This is not a problem of the page. You have simply imposed restrictions on the degrees of freedom of the system that lead to tests that do not exist in our database. So this is a problem related not to the page design itself, but rather to the way our users make use of it.

Please unfix some of the list-boxes (just try them one by one) until you reach cases where there are tests available in our database, among which you can make further selections.

3. I chose one particular test for viewing, and it shows fine. Then, keeping the same test selection parameters, I chose the [Average] of the available tests. Instead of seeing 2 curves, I see only 1. What's the problem? The same holds when I choose [Best] or [Worst].

You have chosen parameters which lead to only 1 available test. In this case the Average is the same as the test itself, and the same holds for the Best and Worst cases. So this is normal: each curve overlaps the other. In order to get distinguishable curves you will have to choose parameters which lead to more than 1 available test!

4. I have measured the same disc with the same reader/software, but the results are different. In the first case I chose the 4x reading speed, while in the second one I chose the maximum allowed by the reader. What is the problem?

Considering the fact that the reported results are closely related to the reading quality of your drive, such results were to be expected. Higher reading speeds can affect the results, though not always negatively. For this reason, we have included the reading speed selection in our test suite, so that the user can identify the speed at which each disc was read. It is up to you whether or not you want to compare only discs tested at the same speed, to keep a common reference.

5. Latest versions of KProbe have an "ECC" setting at the left of the main PI/PO window. Which ECC setting should I select from the available 1-10 scale?

There is no clear answer to this, since the KProbe measuring mechanism has not been clearly documented so far. Again, keeping a stable setting for your measurements makes it easier to have a reference when comparing test results. In most cases, CdrInfo.com sets the ECC setting to 8.

It might be the case that this value defines the mesh of the graph, which essentially determines how far one measurement lies from the previous and the next ones. In that case choosing a particular scale number is irrelevant, because before your test results are submitted they are necessarily summed up per 2-minute intervals for database storage efficiency.

6. What are the acceptable levels of PI/PO and C1/C2 error rates?

1. Parity Inner and Parity Outer Errors for DVD (ECC=8):

  • DVD PI<280/sec (standard)
  • DVD PI<<280/sec (good quality)
  • DVD PO = 0

2. C1 and C2 errors for CD:

C1 = E11 + E21 + E31

C2 errors comprise E12, E22 and E32:

  • E22 - approaching an uncorrectable error
  • E32 - uncorrectable
  • CD E22 and E32 = 0

BLER (Block Error Rate, CD only) = E11 + E21 + E31 per second, averaged over ten seconds: the rate of frames with one or more bad bits.

  • CD BLER < 220/sec (standard)
  • CD BLER << 220/sec (good quality)
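As an illustration only (our own sketch, not CdrInfo.com code), the acceptance limits listed above can be expressed as a simple pass/fail check. The threshold values come from the lists above; the function names are hypothetical:

```python
# Illustrative quality check using the thresholds listed above:
# CD: BLER < 220/sec and E22 = E32 = 0; DVD: PI < 280 and PO = 0.
# Function names are ours, not part of any measuring program.

def cd_within_limits(bler_per_sec, e22, e32):
    """True if a CD measurement meets the 'standard' limits above."""
    return bler_per_sec < 220 and e22 == 0 and e32 == 0

def dvd_within_limits(pi_count, po):
    """True if a DVD measurement meets the 'standard' limits above."""
    return pi_count < 280 and po == 0

print(cd_within_limits(bler_per_sec=35, e22=0, e32=0))   # -> True
print(dvd_within_limits(pi_count=300, po=0))             # -> False
```

A "good quality" disc would of course sit well below these limits, as noted in the lists above.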

7. KProbe offers 3 different options regarding addressing on CDs and DVDs: MSF, LBA and Disc size. What should I choose for measuring my discs?

In general, the "Disc Size" option will work with your CD and DVD media. You should choose this option so that the software automatically measures the whole disc inserted in your reader. (Please also fill the disc completely with data during recording.)

Generally for a CD:

M:S:F (Minutes:Seconds:Frames) is the original unit used for addressing positions on a CD. A frame is actually a sector, and there are 75 frames in 1 second.

LBA (Logical Block Addressing) values are also known as the sectors of a CD. The MMC specification document, where LBA is described, defines a range for the mapping of MSF to LBA values.

Sectors of the CD are defined as signed long integer values, to account for negative values. Here's the relationship:

Positive values are mapped:

MMC LBA: 0~404849 to MSF: 00:02:00~89:59:74

and negative values are mapped accordingly:

Below 2 seconds (MSF), the range from 00:00:00 to 00:01:74 (MSF) is represented as MMC LBA: -150 to -1.

The range from 90:00:00 to 99:59:74 (MSF) is represented as MMC LBA: -45150 to -151.

The range of MMC LBA is a linear range of integer values from -45150 to 404849. However, it is not linear compared with Red Book Addressing (MSF), because it wraps around.

Logical Block Addresses are the integers mapped from CD Red Book (MSF) addressing. They are also known as the sectors of a CD. Other people may have different, simpler interpretations of LBA:

The starting sector is -150 and the ending sector is 449849, which correlates directly to MSF 00:00:00 to 99:59:74. This removes the wrapping of the MMC LBA representation, and the transformation is linear when compared with Red Book Addressing (MSF).
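The MMC mapping above can be sketched in a few lines of Python (our own illustration; the constants follow from the 2-second, i.e. 150-frame, offset and the wrap at 100 minutes described above):

```python
# MSF -> MMC LBA conversion per the mapping described above.
FRAMES_PER_SECOND = 75
OFFSET = 150                            # 2-second (00:02:00) lead-in offset
WRAP = 100 * 60 * FRAMES_PER_SECOND     # 450000 frames = 100 minutes

def msf_to_mmc_lba(m, s, f):
    """Map an MSF address to a signed MMC LBA."""
    frames = (m * 60 + s) * FRAMES_PER_SECOND + f
    if m < 90:
        return frames - OFFSET          # 00:02:00 -> 0, 00:00:00 -> -150
    return frames - WRAP - OFFSET       # 90:00:00 -> -45150

print(msf_to_mmc_lba(0, 2, 0))     # -> 0
print(msf_to_mmc_lba(89, 59, 74))  # -> 404849
print(msf_to_mmc_lba(90, 0, 0))    # -> -45150
print(msf_to_mmc_lba(99, 59, 74))  # -> -151
```

Dropping the `m < 90` branch (always subtracting only the 150-frame offset) gives the simpler, wrap-free interpretation mentioned last: -150 through 449849.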


14. Glossary of Terms

Glossary Terms

Below is a list of terms (in alphabetical order) that appear in this guide and might be of help to our readers. Please ask us to add more terms by submitting the form here.

ECC

Error Correction Code

MSB

Most Significant Byte

LSB

Least Significant Byte

Channel bit (DVD)

The elements by which, after modulation, the binary values ZERO and ONE are represented on the disk by pits.

Physical sector number (DVD)

A serial number allocated to physical sectors on the disk.

Track

A 360° turn of a continuous spiral.

Track pitch

The distance between the centrelines of a pair of adjacent physical tracks, measured in radial direction.

DVD Disc Types

Type A

Consists of a substrate, a single recorded layer and a dummy substrate. The recorded layer can be accessed from one side only. The nominal capacity is 4,7 Gbytes.

Type B

Consist of two substrates, and two recorded layers. From one side of the disk, only one of these recorded layers can be accessed. The nominal capacity is 9,4 Gbytes.

Type C

Consists of a substrate, a dummy substrate and two recorded layers with a spacer between them. Both recorded layers can be accessed from one side only. The nominal capacity is 8,5 Gbytes.

Type D

Consists of two substrates, each having two recorded layers with a spacer between these two recorded layers. From one side of the disk, only one pair of recorded layers can be accessed. The nominal capacity is 17,0 Gbytes.

Track modes

Tracks can be recorded in two different modes called Parallel Track Path (PTP) and Opposite Track Path (OTP). In practice, the lengths of the Data Zones of both layers are independent from each other.

In PTP mode, tracks are read from the inside diameter of the Information Zone to its outside diameter, this applies to both Layer 0 and Layer 1 for Types C and D. On both layers, the track spiral is turning from the inside to the outside.

In OTP mode, tracks are read starting on Layer 0 at the inner diameter of the Information Zone, continuing on Layer 1 from the outer diameter to the inner diameter. Thus, there is a Middle Zone at the outer diameter on both layers, see figure 5b. The track spiral is turning from the inside to the outside on Layer 0 and in the reverse direction on Layer 1.

PO

Parity (of the) Outer (code)

PI

Parity (of the) Inner (code)

DVD ECC Blocks

A DVD ECC Block is a self-contained block of data and error correction codes, grouped into a sequential series of 16 DVD sectors.

An ECC Block is formed by arranging 16 consecutive Scrambled Frames in an array of 192 rows of 172 bytes each. To each of the 172 columns, 16 bytes of Parity of Outer Code are added; then, to each of the resulting 208 rows, 10 bytes of Parity of Inner Code are added. Thus a complete ECC Block comprises 208 rows of 182 bytes each.

PI/PO

The type of measured errors for DVDs. The I corresponds to "inner", the O to "outer". The distinction is made when each sector, along with its error correction data, is presented in tabular form: the I's and O's then correspond to the 2 directions (rows and columns) of the plane.

Recording Frames

Sixteen Recording Frames shall be obtained by interleaving one of the 16 PO rows at a time after every 12 rows of an ECC Block. Thus the 37,856 bytes of an ECC Block are re-arranged into 16 Recording Frames of 2,366 bytes. Each Recording Frame consists of an array of 13 rows of 182 bytes.
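The byte arithmetic in the ECC Block and Recording Frame entries can be verified directly; the following is just a sanity check of the figures quoted above, not drive code:

```python
# Sanity check of the ECC Block / Recording Frame geometry given above.
rows, cols = 192, 172            # 16 scrambled frames of 12 rows x 172 bytes
rows += 16                       # 16 rows of Parity Outer per column -> 208 rows
cols += 10                       # 10 bytes of Parity Inner per row  -> 182 cols
ecc_block = rows * cols
print(ecc_block)                 # -> 37856 bytes per ECC Block
print(ecc_block // 16)           # -> 2366 bytes per Recording Frame
print((ecc_block // 16) // 182)  # -> 13 rows per Recording Frame
```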

Random errors (DVD)

A byte error occurs when one or more bits in a byte have a wrong value, as compared to their original recorded value. A row of an ECC Block that has at least 1 byte in error constitutes a PI error. If a row of an ECC Block contains more than 5 erroneous bytes, the row is said to be “PI-uncorrectable”. During playback after the initial recording, the errors as detected by the error correction system shall meet the following requirements:

- in any 8 consecutive ECC Blocks the total number of PI errors before correction shall not exceed 280,

- in any ECC Block the number of PI-uncorrectable rows should not exceed 4.

Jitter (DVD)

Jitter is the standard deviation σ of the time variation of the digitized data passed through the equalizer. The jitter of the leading and trailing edges is measured against the PLL clock and normalized by the Channel bit clock period.
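As an illustration of this definition (ours, with invented numbers), the normalized jitter of a set of edge-timing deviations is simply their standard deviation divided by the channel bit period; the 38.23 ns period used here corresponds to the nominal 1x DVD channel bit rate and is only an example value:

```python
# Normalized jitter per the definition above: standard deviation of the
# leading/trailing-edge time deviations (vs. the PLL clock), divided by
# the channel bit clock period T. All sample values are invented.
from statistics import pstdev

T = 38.23e-9                        # channel bit period in seconds (illustrative)
deviations = [1.1e-9, -0.8e-9, 0.4e-9, -1.3e-9, 0.6e-9]  # edge deviations vs PLL clock
jitter = pstdev(deviations) / T     # dimensionless, usually quoted as a percentage
print(round(jitter * 100, 1))       # -> 2.4 (percent of T)
```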

Sector

In case of CD media, “Sector” refers to the data contained in one frame. In the CD-ROM standard document the term block is used for this unit. In the case of DVD media, “Sector” is the smallest user addressable part of media. The user data contained within a sector is 2048 bytes.

Physical Sectors (DVD)

The structure of a Physical Sector shall consist of 13 rows, each comprising two Sync Frames. A Sync Frame shall consist of a SYNC Code from table 4 and 1,456 Channel bits representing the first and second 91 8-bit bytes, respectively, of a row of a Recording Frame. The first row of the Recording Frame is represented by the first row of the Physical Sector, the second by the second, and so on. Recording shall start with the first Sync Frame of the first row, followed by the second Sync Frame of that row, and so on, row by row.

Data Frames (DVD)

A Data Frame shall consist of 2,064 bytes arranged in an array of 12 rows, each containing 172 bytes. The first row shall start with three fields, called Identification Data (ID), the check bytes of the ID Error Detection Code (IED), and Copyright Management Information (CPR_MAI), followed by 160 Main Data bytes. The next 10 rows shall each contain 172 Main Data bytes, and the last row shall contain 168 Main Data bytes followed by four bytes for recording the check bits of an Error Detection Code (EDC). The 2,048 Main Data bytes are identified as D0 to D2047.
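The byte accounting of the Data Frame can be checked directly. Note that the ID (4 bytes), IED (2 bytes) and CPR_MAI (6 bytes) field sizes are taken from the DVD specification and are not stated in the text itself:

```python
# Byte accounting for the Data Frame described above. The ID (4), IED (2)
# and CPR_MAI (6) field sizes come from the DVD specification, not the text.
first_row = 4 + 2 + 6 + 160        # ID + IED + CPR_MAI + Main Data
middle    = 10 * 172               # 10 rows of Main Data
last_row  = 168 + 4                # Main Data + EDC
print(first_row)                       # -> 172
print(first_row + middle + last_row)   # -> 2064 bytes per Data Frame
print(160 + 10 * 172 + 168)            # -> 2048 Main Data bytes (D0..D2047)
```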

Reed-Solomon code

An error detection and/or correction code.

BLER

Block Error Rate. The number of frames per second containing C1 errors (E11 + E21 + E31), averaged over ten seconds. It should be below 220 for a perfectly readable disc.

C1/C2

Type of errors during reading a recordable or pressed disc. It is evaluated per sector and presented (usually) averaged or summed-up over a number of sectors. (This gives rise to the various BLER measurements.)

CD-R

A glorious recordable disc:)

DVD-R

The recordable format defined by the DVD Forum. The term might also refer to the media, or to a drive capable of handling (recording) them.

DVD-RW

Re-writable format/disc/..., according to the DVD Forum specifications.

KProbe

A program for detecting reading errors that works on drives (readers, recorders) based on the MediaTek chipset.

LBA

Logical Block Address. The disc is measured (addressed) in intervals of 1/75 of a second (one sector or frame) at 1x speed.

MSF (CD)

MSF Address (Minute/Second/Frame). The physical address, expressed as a sector count relative either to the beginning of the medium (absolute) or to the beginning of the current track (relative). As defined by the CD standards, each F field unit is one sector, each S field unit is 75 F field units, and each M field unit is 60 S field units. Valid contents of F fields are binary values from 0 through 74. Valid contents of S fields are binary values from 0 through 59. Valid contents of M fields are binary values from 0 through 99.

For example: 63m:45s:70f. It is an alternative disc addressing scheme, essentially the same as the LBA method if you divide the LBA by 75 (the number of frames per second) and then transform the seconds to minutes by dividing by 60. In the literature there is a convention of a 2-second track lead-in offset.

UmDoctor

A program by Sanyo for estimating and recording disc reading errors (in the form of C1/C2 for CDs and PI/Uncorrectable-PI for DVDs). It works on drives based on Sanyo chipsets. In the case of DVDs, the software measures PI/Uncorrectable-PI errors per 8 ECC blocks.

Groove

A trench-like feature of the disk, applied before the recording of any information, and used to define the track location. The groove is located nearer to the entrance surface than the land. The recording is made on the centre of the groove.

Land

The area between the grooves.

CIRC

Cross Interleaved Reed-Solomon Code (CIRC) is the error detection and correction technique used within small frames of CD audio or data. The CIRC bytes are present in all CD-ROM data modes. The error correction procedure which uses the CIRC bytes is referred to as the CIRC based algorithm. In most CD-ROM drives, this function is implemented in hardware.

Frame (CD)

A sector on CD media. Also the F field unit of a MSF CD address. The smallest addressable unit in the main channel.

DVD Control Area

The DVD Control area comprises 192 ECC blocks in the Lead-in Area of a DVD medium. The content of the 16 sectors in each block is repeated 192 times. This area contains information concerning the disc.

Lead-in Area

The CD Lead-in area is the area on a CD-ROM disk preceding track one. The area contains the TOC data and precedes each program area. The main channel in the lead-in area contains audio or data null information. This area is coded as track zero but is not directly addressable via the command set. The Q sub-channel in this area is coded with the table of contents information.

The DVD Lead-in area is the area comprising physical sectors 1.2 mm wide or more adjacent to the inside of the Data area. The area contains the Control data and precedes the Data area.

Lead-out Area

The CD Lead-out area is the area on a CD-ROM disk beyond the last information track. The main channel in the lead-out area contains audio or data null information. This area is coded as track AA but is not directly addressable via the command set. The READ CD-ROM CAPACITY data is the first logical block address of this area minus one.

The DVD Lead-out area is the area comprising physical sectors 1.0 mm wide or more adjacent to the outside of the data area in single layered disc for PTP (Parallel Track Path) disc, or area comprising physical sectors 1.2 mm wide or more adjacent to the inside of the data area in layer 1 of OTP (Opposite Track Path) disc.

DVD Reference Code

The DVD Reference code comprises 2 ECC blocks (32 sectors) in the Lead-in Area and is used for the adjustment of the equalizer system of the drive hardware.


15. Programming decisions

Programming decisions for developing the Media Quality Tests

During the preparation of this section of our site, we understood that the user interface would play a significant role in our readers' satisfaction while querying our database for quality tests.

In many respects pure HTML is inadequate for handling highly interactive scenarios such as the one we felt our solution represented. We therefore had to make some decisions in advance.

Instead of having each user press a dozen buttons on scattered pages to find the tests he wanted, we felt we had to offer him a seemingly simple approach with the look and feel of an interactive (Windows) program.

All options should be available on a single page. Interactions with the page controls (buttons, list-boxes, etc.) should not distract the user's attention by imposing postbacks to the server, at least not in a visible way :) A look at both the initial and the final state of the main Test Page (see below) shows that we have succeeded in devising the illusion that the user is actually working in a "pure" (Windows) program. In fact, any such program would in reality be exactly as fast as our Dhtml code, not a single second faster!

In the above picture (showing the selections leading to a particular test) we see that the user interface has been kept constant throughout all user interactions with the page.

In order to achieve all this, we had to make some browser compatibility sacrifices. We could achieve the same result by either using iframes or webservices/remote-scripting. In all cases the use of Dhtml is essential for a technically acceptable outcome. In the latter case we would have to restrict our audience to users of Internet Explorer 5.5 or later. In the former case, in general any series 4 or newer browser might be able to handle our page.

We have not tested (yet) thoroughly the compatibility of our solution with the latest Netscape/Mozilla and Opera browsers versions. This would be beyond our site?s capabilities (both in terms of personnel and necessary work-hours), for the time being. We will try to improve, in the future, compatibility with these browsers, in case the lack of standards implementation weights more towards our side.

(The record of thin standard DHTML implementation in both of these browser families (NS and Opera), however, might limit our efforts to reach a wider browser audience.)

View Page criticism

We have heard some readers arguing about the method we chose for displaying the tests, and we accept and understand each and every objection. Before sending us your opinion, please try to think of a different way of arranging the user interface that takes into account all the needs the View Page has to serve. We expect that your own opinion will then come close to the one we finally adopted. If, however, this is not the case, we would be glad to hear it by e-mail here, with the appropriate subject.


16. APPENDIX 1. UMDoctor Pro II


In this and the following appendices you can find some useful information on the four programs that are able to report C1/C2 and/or PI/PO errors on regular production-level recorder drives. We include a brief description of each program, its features, and possible restrictions, especially those concerning our quality tests.

At the end of each page you can find a table of the drives compatible with each program, as far as error measurement in the form of C1/C2 and PI/PO is concerned.

UMDoctor Pro II is an accurate and easy-to-use program. Within just a few steps the user is able to test the C1/C2 errors of a written CD-R and the PI/Uncorrectable-PI errors of a DVD or DVD-R. For this demonstration we will use a CD-R disc.

This is the main screen of UMDoctor Pro II. Let's take a look at the settings by pressing the Settings button:

In this window we choose the test device we want to use. Note that the software works only with DVD recorders that support Sanyo's HD-BURN feature, namely the Optorite DD0401, DD0201 and DD0203. Here we can also select the type of data displayed in the graph (PI, Uncorrectable-PI, C1 and C2). The user is also able to adjust the accuracy of the graph in relation to the data addresses (fine, rough).

According to Sanyo, UMDoctor PRO II measures PI and Uncorrectable-PI errors for every 8 ECC blocks.

We press the Go button and the test process starts.

After the test ends, right-click on the graph area, choose "Save As Data" (*.dat) from the menu, and select the destination folder.

For the DVD test, just follow the same steps; UMDoctor Pro II will automatically recognize the inserted DVD and adjust the settings for you.

As pointed out, UMDoctor Pro II works with the Optorite DD0201, DD0203 and DD0401 drives. Newer versions of the software will add support for more drives. UMDoctor PRO II is available on the Sanyo website.


17. APPENDIX 2. KProbe


With this software you can measure the writing quality of both CD and DVD media: the C1/C2 error rates on a CD, or the PI/PO levels on a DVD.

It is very simple. Make sure you are using a LiteOn drive (CD-RW, DVD recorder or CD-RW combo); otherwise you won't be able to use KProbe. Insert the media, run the program, and check the "Write Strategy" tab.

We suggest adjusting the measurement settings as illustrated below; the first screenshot is for a DVD and the second for a CD. The "Realtime Chart" is not necessary, but with it you can watch the whole process in real time.

For both CDs and DVDs, it is recommended to choose the "Disc Size" option; the software will then automatically read the whole disc without further adjustment by the user. Optionally, the user can set the measuring margins manually, by entering the MSF or LBA starting and ending addresses. More information about MSF and LBA is available in the FAQ section. For submitting your own tests it suffices to choose the whole-disc option, provided you also filled the disc with data during recording.
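The relation between the two addressing schemes can be sketched as follows. Standard CD addressing uses 75 frames per second, and logical block 0 corresponds to the absolute address MSF 00:02:00 (a 150-frame lead-in offset); the conversion below assumes absolute MSF addresses.

```python
# Convert between CD MSF (minutes:seconds:frames) and LBA addresses.
# 75 frames per second; LBA 0 corresponds to absolute MSF 00:02:00.

FRAMES_PER_SECOND = 75
LEAD_IN_OFFSET = 150  # 2 seconds * 75 frames

def msf_to_lba(minutes: int, seconds: int, frames: int) -> int:
    """Absolute MSF address -> logical block address."""
    return (minutes * 60 + seconds) * FRAMES_PER_SECOND + frames - LEAD_IN_OFFSET

def lba_to_msf(lba: int) -> tuple[int, int, int]:
    """Logical block address -> absolute MSF address."""
    total = lba + LEAD_IN_OFFSET
    minutes, rem = divmod(total, 60 * FRAMES_PER_SECOND)
    seconds, frames = divmod(rem, FRAMES_PER_SECOND)
    return minutes, seconds, frames

# A full 74-minute disc ends near MSF 74:00:00, i.e. LBA 332850.
print(msf_to_lba(74, 0, 0))
```

So, for example, entering MSF 00:02:00 as a starting address is the same as entering LBA 0.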

The selected reading speed is sometimes critical for the reliability of the results. For both CDs and DVDs, you can either enter the reading speed manually or check the max speed option. When you select the max option, the software will scan your CD/DVD for errors at the maximum reading speed available to your reader. However, testing the same disc at the maximum and at a lower speed does not always give the same results. For this reason, we ask users to clearly state the reading speed they selected for their measurements on the "Submit Test" page. This way, everyone will be able to compare results obtained at the same or at different error-scanning speeds.

For DVDs, the user is able to select the ECC value (1-10). A different ECC value for each scan gives different error-level values. As in the previous case, keeping the same ECC value across your scans will give more comparable results.
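Why the ECC setting matters can be illustrated with a small sketch (this is not KProbe's actual internals, just the arithmetic): if PI errors are counted per ECC block and the reported value is the sum over a window of N consecutive blocks, then the same raw counts produce different peak values for different window sizes.

```python
# Illustrative sketch: sum per-ECC-block PI error counts over
# consecutive windows of `ecc_window` blocks. Larger windows yield
# larger sums, which is why scans made with different ECC settings
# are not directly comparable.

def pi_sums(per_block_errors: list[int], ecc_window: int) -> list[int]:
    """Sum per-ECC-block error counts over consecutive windows."""
    return [
        sum(per_block_errors[i:i + ecc_window])
        for i in range(0, len(per_block_errors), ecc_window)
    ]

raw = [0, 2, 1, 0, 3, 0, 1, 1, 0, 2]   # hypothetical per-block counts
print(max(pi_sums(raw, 1)))   # peak with ECC window 1
print(max(pi_sums(raw, 5)))   # peak with ECC window 5
```

With the hypothetical counts above, the peak is 3 for a window of 1 block but 6 for a window of 5 blocks, even though the disc is the same.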

As soon as you press the "Start" button, the process will begin. If you have inserted a CD, you will see the C1/C2 measurement; with a DVD, you will get the PI/PO errors.

When the procedure finishes, you will see a chart like the one below.

Now you have to save the results. Just press the save icon, the one with the disc on it, and you will be asked to select which chart you want to save. Check both (as below) and press OK.


Give the file a short name and set its type to .csv (as you can see in the photo above), so that you will be able to use it in the CdrInfo media quality tests. Press "Save". Please do not use extra periods in the file name: our submission application does not allow more than one period (the one for .csv), and extra periods will cause trouble when submitting your tests.
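The file-name rule above can be sketched as a simple check (the exact check our submission application performs is an assumption; this just mirrors the stated restriction):

```python
# Sketch of the submission file-name rule: the name must end in .csv
# and may contain only one period -- the one introducing the extension.
# (Mirrors the restriction described in the text; the server-side
# check itself is an assumption.)

def is_valid_submission_name(filename: str) -> bool:
    return filename.lower().endswith(".csv") and filename.count(".") == 1

print(is_valid_submission_name("mydisc.csv"))    # True
print(is_valid_submission_name("my.disc.csv"))   # False: extra period
print(is_valid_submission_name("mydisc.dat"))    # False: wrong type
```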


18. APPENDIX 3. CdSpeed


For CD quality measurements of the type discussed in our review, you will need Nero CDSpeed v2.11 or higher. Note that you can only measure CD media at this time. Unfortunately, the software does not allow saving the quality results in a *.csv or *.dat format, so you cannot presently use this program to submit your own results. However, we hope Ahead will add support for this type of file saving, along with support for DVD checking (PI/PO), in the immediate future.

Run the program and select its "CD Quality Test..." option, in the "Extra" drop-down menu, as shown below.

As soon as you do this, you will see the next screen:

Check "Report C1 errors" if you want to see the C1 error level in the graph, and check the "Show speed" option if you want to include the speed graph. Adjust the speed and press Start.

Before the process begins, you will have to wait for the drive to spin up.


Below is a complete quality test with a data CD. The green line is the speed, while the blue and yellow lines refer to the C1 and C2 error levels respectively.

If you want to adjust the settings, you can easily do so from the File menu under "Options", as seen below.

Check "CD Quality" on the left and adjust the settings according to your needs. You can change the graph limit for errors and the maximum speed, and also select a color of your choice for the speed, C1 and C2 lines by pressing each color and adjusting accordingly.

CdSpeed works with Plextor Premium and LiteOn CD-RW recorders as readers.


19. APPENDIX 4. PlexTools Professional


PlexTools Professional offers detailed quality scans for CDs. The user is able to run the Q-Check C1/C2 test, the FE/TE (Focus/Tracking Errors) test and the Beta/Jitter test. Note that the software works only with the Plextor Premium CD-RW drive and, as in the case of CdSpeed, the user is not able to save the results in a numerical format. (We expect this behavior to change in the foreseeable future.)

The following picture shows the main screen of the PlexTools Professional program.

The C1/C2 read test applies to both written CD-R/RW media and stamped CDs. The test measures the quality of the written or pressed media by counting the number of low-level errors. Note that most of these errors (C1) are correctable by the drive when no C2 errors are present. The software offers C1, C2 and CU (uncorrectable) error measurements.

The C1 value indicates BLER (Block Error Rate), which is the number of E11+E21+E31 errors.

C2 indicates the number of E22 errors.

CU indicates the number of E32 errors.
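These groupings can be expressed directly; the E-counts below are hypothetical sample values, not measurements.

```python
# The C1/C2/CU groupings of the low-level E-error counts, expressed
# directly. The sample counts in the print call are hypothetical.

def c1(e11: int, e21: int, e31: int) -> int:
    """C1 (BLER): the sum of E11, E21 and E31 errors."""
    return e11 + e21 + e31

def c2(e22: int) -> int:
    """C2: the number of E22 errors."""
    return e22

def cu(e32: int) -> int:
    """CU: the number of E32 (uncorrectable) errors."""
    return e32

print(c1(e11=10, e21=4, e31=1))  # BLER for these sample counts: 15
```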

More information about the nature of these errors is available in our FAQ section. In order to achieve reliable results, independent of the influence of the drive's performance, the tests should be performed at the lowest speeds. You are free, however, to use speeds near the top available, provided the disc is well balanced. Discs that are noisy during the tests should be avoided, as the consistency of the tests across repetitions will be questionable.

We have selected the Q-Check C1/C2 Test. The first step is to select the device (reader) we are going to use:

There is no need to change options for this measurement. Choose the Q-Check C1/C2 Test mode and just make sure that the C1, C2 and CU checkboxes are checked.

We choose the reading test speed and press Start:

When the test ends, a pop-up window informs the user about the C1 and C2 errors; we don't actually need that window, so just press Close. To save the graph, right-click on the graph area, choose "Save as...", and select the destination folder.

For the Beta/Jitter test we select the appropriate option. The test applies to written CD-R/RWs. Below, we will check the write quality of a disc by checking the jitter rate and the Beta value. The Beta value is an indication of how well the recorded pits and lands are balanced (asymmetry). A horizontal line would mean perfect balance and would give the best readability.

The jitter function, as currently estimated by this program, shows the average jitter rate in the 3T~11T range. Lower jitter indicates better readability and a smaller chance of (un)correctable errors.
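An "average jitter rate over 3T~11T" can be sketched as a plain mean of the per-run-length jitter values (how the program actually weights the run lengths is not documented here, so treat this as an assumption; the sample percentages are hypothetical):

```python
# Sketch: mean jitter over the measured run lengths 3T..11T.
# The weighting is an assumption; the sample values are hypothetical.

def average_jitter(jitter_by_run_length: dict[int, float]) -> float:
    """Mean jitter (%) over the run lengths 3T..11T."""
    values = [jitter_by_run_length[t] for t in range(3, 12)]
    return sum(values) / len(values)

sample = {t: 8.0 + 0.1 * t for t in range(3, 12)}  # hypothetical %
print(round(average_jitter(sample), 2))
```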

More information about Beta/Jitter is available in our FAQ section.

We select the Q-Check Beta/Jitter Test as illustrated below:

We make sure that the "Show Beta" and "Show Jitter" checkboxes are marked, and we press Start to begin the test.

The method of saving the test results is identical to the previous C1/C2 test: just right-click on the graph, choose "Save as..." from the popup menu, and select the destination folder.

Q-Check FE/TE test

This is a write test for blank CD-R/RW media. It will measure the mechanical characteristics of the media. FE (Focus Errors) indicates how well the pickup can focus the laser beam on the disc surface. TE (Tracking Errors) indicates how well the pickup can follow the spiral track of a disc.

None of the above tests offer any numerical output, and thus, they are not included in our test results suite, at least for now.


