$ ls -l
total 7070588K
-rw-r--r-- 1 jd jd 3616604160 Oct  1 23:27 f14.iso
-rw-r--r-- 1 jd jd       1602 Sep 29 20:35 Fedora-14-Beta-i386-CHECKSUM
-rw-r--r-- 1 jd jd 3616593920 Sep 29 21:56 Fedora-14-Beta-i386-DVD.iso
I burned Fedora-14-Beta-i386-DVD.iso to a blank DVD. The first time I used wodim, and the second time I used growisofs. After each burn, I would dd the DVD back in:
dd if=/dev/sr0 of=f14.iso bs=2k
In both cases the size of f14.iso was 10240 bytes larger than Fedora-14-Beta-i386-DVD.iso.
So, naturally, I will not be able to get the same sha256 sum as that of Fedora-14-Beta-i386-DVD.iso.
So, how do I dd the DVD in and get the same size as the original ISO?
Perhaps growisofs and wodim are padding during the burn. If that is the case, how do I suppress it?
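For the record, the mismatch works out to a whole number of DVD sectors:

```shell
# The extra bytes are exactly five 2048-byte sectors:
echo $((3616604160 - 3616593920))              # 10240
echo $(( (3616604160 - 3616593920) / 2048 ))   # 5
```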
JD <jd1008 <at> gmail.com> writes:
I burned Fedora-14-Beta-i386-DVD.iso to a blank DVD. The first time I used wodim, and the second time I used growisofs. After each burn, I would dd the DVD back in:
dd if=/dev/sr0 of=f14.iso bs=2k
In both cases the size of f14.iso was 10240 bytes larger than Fedora-14-Beta-i386-DVD.iso.
In general this won't work unless dd is told exactly how much to read off. Unfortunately, it works often enough that people think it's supposed to. Use the rawread script from
http://www.troubleshooters.com/linux/coasterless.htm#rawread
instead. This determines the ISO size from the ISO header and then automatically reads off the right amount.
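In spirit, what rawread does can be sketched like this (a hedged sketch, not the actual script; it assumes an ISO 9660 primary volume descriptor at sector 16 of the disc and a little-endian host, and reads the little-endian halves of the both-endian header fields):

```shell
#!/bin/sh
iso_geometry() {
  # $1: device or ISO file. Prints "<block size> <block count>".
  # Logical block size: 16-bit LE field at byte 32768+128 of the PVD.
  _bs=$(dd if="$1" bs=1 skip=32896 count=2 2>/dev/null | od -An -tu2 -v | tr -d ' ')
  # Volume size in blocks: 32-bit LE field at byte 32768+80 of the PVD.
  _count=$(dd if="$1" bs=1 skip=32848 count=4 2>/dev/null | od -An -tu4 -v | tr -d ' ')
  echo "$_bs $_count"
}

# Against a real drive (only attempted if one is present):
if [ -r /dev/sr0 ]; then
  set -- $(iso_geometry /dev/sr0)
  dd if=/dev/sr0 of=f14.iso bs="$1" count="$2" conv=notrunc,noerror
fi
```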
There is a defect in the Live and netinst images (which started in F12 Alpha, and only affected i386 and x86_64, not ppc), in that they are larger than the ISO header indicates:
https://bugzilla.redhat.com/show_bug.cgi?id=585006
To read these off properly, the rawread script won't work, since it assumes that the ISO header size is correct. Instead, you have to manually compose a dd command that reads off the size corresponding to the actual ISO file (not the size in the ISO header).
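For that manual case, a hedged sketch (the helper name readback_count is made up here; it assumes the image's size is an exact multiple of 2048 bytes, which is true for Fedora images):

```shell
#!/bin/sh
readback_count() {
  # $1: the original ISO file. Prints the dd count for bs=2048,
  # derived from the file's actual size rather than the ISO header.
  echo $(( $(stat -c %s "$1") / 2048 ))
}

# Usage, assuming the original ISO is on hand:
#   dd if=/dev/sr0 of=readback.iso bs=2048 count=$(readback_count Fedora-14-Beta-i386-DVD.iso)
#   cmp Fedora-14-Beta-i386-DVD.iso readback.iso && echo match
```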
On Sat, Oct 2, 2010 at 4:40 AM, Andre Robatino <robatino@fedoraproject.org> wrote:
One can also tell dd to read by single physical-layer sectors using the "bs=2048" option. This may increase the read time a bit, but generally results in a more consistent chance of getting the correct size.
Gregory Woodbury <redwolfe <at> gmail.com> writes:
One can also tell dd to read by single physical-layer sectors using the "bs=2048" option. This may increase the read time a bit, but generally results in a more consistent chance of getting the correct size.
-- G.Wolfe Woodbury
The rawread script does this - it reads both the block size (normally 2048) and the block count from the ISO header, and composes a dd command that looks something like
dd if=/dev/dvd bs=2048 count=1765915 conv=notrunc,noerror
for a 3616593920-byte ISO (assuming the information in the ISO header is correct, which it currently is for Fedora install DVD and CD images, but not Live or netinst). JD's original command used the equivalent "bs=2k" option, but not the count option. The dd man page only documents upper-case K for 1024, but I tested and lower-case k works the same.
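The arithmetic in that example checks out:

```shell
# 1765915 blocks of 2048 bytes is exactly the DVD ISO's size:
echo $((1765915 * 2048))   # 3616593920
```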
On 10/02/2010 01:40 AM, Andre Robatino wrote:
Thank you Andre. Very good info!!
JD
On 10/02/2010 04:03 AM, Gregory Woodbury wrote:
One can also tell dd to read by single physical-layer sectors using the "bs=2048" option. This may increase the read time a bit, but generally results in a more consistent chance of getting the correct size.
-- G.Wolfe Woodbury
So, what block size do you think I am using in my command:
dd if=/dev/sr0 of=f14.iso bs=2k
It has no effect. As Andre states, the correct size is in the header info. The actual physical size on the DVD seems to have been padded by 10240 bytes (not sure if that is done during the DVD burn, or during the DVD read), and the script Andre pointed me (us) to does the right thing.
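To confirm a fixed-size readback, the checksums should now agree (filenames as in the listing at the top of the thread; both files must be present):

```shell
# Compare the read-back image against the original ISO:
if [ -f f14.iso ] && [ -f Fedora-14-Beta-i386-DVD.iso ]; then
  sha256sum f14.iso Fedora-14-Beta-i386-DVD.iso
fi
```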
Thanx again, Andre!
On Sat, 2010-10-02 at 10:52 -0700, JD wrote:
The actual physical size on the DVD seems to have been padded by 10240 bytes (not sure if that is done during dvd burn, or during dvd read),
When one uses features like "burnproof" so that the burning process can pad out buffer underruns, and keep on burning until more data comes through, instead of halting, is that transparent to the (later) reading of the disc, or a likely cause of this sort of problem?
On 10/02/2010 04:40 PM, Tim wrote:
I have no idea, because I did not enable the burnproof option, nor overburn. I was under the impression that the -pad option was for burning audio CDs only.
Tim <ignored_mailbox <at> yahoo.com.au> writes:
Due to the linux readahead bug, which has never been fixed, padding is a Good Thing (as long as there's a little extra space on the disc to allow it). I always use the isopad script at
http://ftp.cs.utoronto.ca/pub/hugh/isopad
before using growisofs to burn discs. When using wodim, its -pad option can be used. (The -pad option for growisofs is not the same and in fact can't be used when burning an existing ISO file.) The proper way to avoid problems with read size is to always specify the correct size in the dd command - which the rawread script does, assuming the ISO header information is correct; otherwise it has to be done manually. If the size isn't specified, whether it works is hardware-dependent.
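A minimal sketch of what end-padding looks like (the actual isopad script may behave differently; the 150-sector figure here is an arbitrary illustration):

```shell
#!/bin/sh
pad_iso() {
  # $1: ISO file. Appends 150 sectors (307200 bytes) of zeros so a
  # readahead past the end of the data stays on readable sectors.
  # Note the padded file's checksum no longer matches the original,
  # so a readback must be verified against the unpadded size.
  dd if=/dev/zero bs=2048 count=150 >> "$1" 2>/dev/null
}
```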
Tim:
When one uses features like "burnproof" so that the burning process can pad out buffer underruns, and keep on burning until more data comes through, instead of halting, is that transparent to the (later) reading of the disc, or a likely cause of this sort of problem?
Andre Robatino:
Due to the linux readahead bug, which has never been fixed, padding is a Good Thing (as long as there's a little extra space on the disc to allow it).
I was thinking of the schemes that insert padding in the middle of a burn, so the burn process can keep going instead of having to pause (which a DVD+ disc can do) or abort completely (for media that can't simply pause burning) - rather than padding at the end of the disc, so that discs end up at a particular desired size or don't have random data past the end.
If the burning process had to pad out the middle with say 60 megs of padding space, that could (a) affect checksumming, and (b) cause you problems if you expected to read 650 megs of data off a disc by reading only the *first* 650 megs worth of bits on it. This is presuming that padding in the middle isn't transparent (i.e. done in a way that isn't noticeable when the disc is, later on, read).