Forum Topic - Large file giving out various cksum result on each run: (5 Items)
   
Large file giving out various cksum result on each run  
I am running QNX 4.25 on a system with a dual-core Intel E6400 CPU at 2.13 GHz.
I can install the QNX software without any problem and the system seems to run just fine.
However, when I try to decompress a .tar.F archive of about 200 MB, I get an error at the point in the process where it asks me
to insert the next part of the archive.

I verified that the archive was restorable on a different unit.
Checking the transferred file with cksum, I discovered that the cksum utility reports a different checksum each time it is run on that large file.
The reported size of the file, however, matches exactly.

If I run cksum on any other file, I seem to get the proper result every time.
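
For a given byte stream, cksum's CRC is fully deterministic, so a value that changes from run to run means that successive reads of the file are returning different bytes. As a sanity check that takes the stock utility out of the picture, a minimal sketch along these lines could be compiled on the affected box; the file name and pass count are placeholders, and a plain byte sum is used instead of cksum's CRC, since the point is only to see whether repeated reads agree.

/* Hypothetical sketch: run several read passes over the same file and
 * print a simple 32-bit byte sum per pass.  If the file and the read
 * path are healthy, every pass must print the same value. */
#include <stdio.h>
#include <stdlib.h>

#define PASSES 5
#define BUFSZ  65536

int main(int argc, char **argv)
{
    const char *path = (argc > 1) ? argv[1] : "bigfile.tar.F";  /* placeholder */
    static unsigned char buf[BUFSZ];
    int pass;

    for (pass = 1; pass <= PASSES; pass++) {
        FILE *fp = fopen(path, "rb");
        unsigned long sum = 0, total = 0;
        size_t n, i;

        if (fp == NULL) {
            perror(path);
            return 1;
        }
        while ((n = fread(buf, 1, sizeof buf, fp)) > 0) {
            for (i = 0; i < n; i++)
                sum += buf[i];
            total += n;
        }
        fclose(fp);
        printf("pass %d: sum=%lu bytes=%lu\n", pass, sum, total);
    }
    return 0;
}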

We first suspected the 500 GB SATA drive and the latest Fsys.atapi driver, so we redid the test on the same unit with a
10 GB EIDE drive running Fsys.eide.  The results were the same.

We are not certain, but we suspect that the dual-core CPU may be causing the issue; we are not sure what should be
tried next.

Has anyone experienced a similar issue and found a solution?
Thanks
RE: Large file giving out various cksum result on each run  
 

> -----Original Message-----
> From: Remi Duquet [mailto:community-noreply@qnx.com] 
> Sent: February 26, 2009 2:55 PM
> To: general-filesystems
> Subject: Large file giving out various cksum result on each run
> 
> I am running QNX 4.25 on a system with a dual-core Intel E6400 CPU
> at 2.13 GHz.
> I can install the QNX software without any problem and the
> system seems to run just fine.
> However, when I try to decompress a .tar.F archive of about 200 MB,
> I get an error at the point in the process where it asks me to
> insert the next part of the archive.
> 
> I verified that the archive was restorable on a different unit.

How did you test this?  Did you copy the archive from the failed machine
to another machine?  Or did you download the archive on the other
machine directly?

> Checking the transferred file with cksum, I discovered that the
> cksum utility reports a different checksum each time it is run
> on that large file.
> The reported size of the file, however, matches exactly.
> 
> If I run cksum on any other file, I seem to get the proper
> result every time.

Are any of the other files as large as the .tar.F archive?

> 
> We first suspected the 500 GB SATA drive and the latest
> Fsys.atapi driver, so we redid the test on the same unit
> with a 10 GB EIDE drive running Fsys.eide.  The results were the same.
> 
> We are not certain, but we suspect that the dual-core CPU
> may be causing the issue; we are not sure what should be tried next.

QNX 4.x has no support for SMP, so it should only be running on one processor.
Does the BIOS on your computer allow disabling the second core?  It
should not make a difference, but it's also an easy thing to try.

> 
> Has anyone experienced a similar issue and found a solution?
> Thanks
> 
> _______________________________________________
> General
> http://community.qnx.com/sf/go/post23016
> 
> 
RE: Large file giving out various cksum result on each run  
I've seen this, but with networking. The data was getting corrupted when being moved from the network card to system
memory.  It only showed up when doing large file transfers.
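
A cheap way to separate a stable-but-corrupt file from an unstable read path is to compare the file against itself: a file damaged during transfer still compares equal to its own contents, while differences between two reads point at the driver/DMA/RAM path rather than at the data on disk. A rough sketch of that idea is below; the file name is a placeholder, and note that the filesystem cache may satisfy both streams from the same buffers unless the file is much larger than the Fsys cache.

/* Hypothetical sketch: open the same file twice and compare the two
 * streams byte for byte, reporting the first offset that differs. */
#include <stdio.h>

#define BUFSZ 65536

int main(int argc, char **argv)
{
    const char *path = (argc > 1) ? argv[1] : "bigfile.tar.F";  /* placeholder */
    FILE *a = fopen(path, "rb");
    FILE *b = fopen(path, "rb");
    static unsigned char bufa[BUFSZ], bufb[BUFSZ];
    unsigned long offset = 0;
    size_t na, nb, i;

    if (a == NULL || b == NULL) {
        perror(path);
        return 1;
    }
    for (;;) {
        na = fread(bufa, 1, sizeof bufa, a);
        nb = fread(bufb, 1, sizeof bufb, b);
        if (na == 0 && nb == 0)
            break;                      /* both streams exhausted: reads agree */
        if (na != nb) {
            printf("length mismatch near offset %lu\n", offset);
            return 1;
        }
        for (i = 0; i < na; i++) {
            if (bufa[i] != bufb[i]) {
                printf("first difference at offset %lu\n", offset + i);
                return 1;
            }
        }
        offset += na;
    }
    printf("two reads of %s agree\n", path);
    return 0;
}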

> -----Original Message-----
> From: David Sarrazin [mailto:community-noreply@qnx.com]
> Sent: February-26-09 3:20 PM
> To: general-filesystems
> Subject: RE: Large file giving out various cksum result on each run
> 
> 
> 
> > -----Original Message-----
> > From: Remi Duquet [mailto:community-noreply@qnx.com]
> > Sent: February 26, 2009 2:55 PM
> > To: general-filesystems
> > Subject: Large file giving out various cksum result on each run
> >
> > I am running QNX 4.25 on a system with a dual-core Intel E6400 CPU
> > at 2.13 GHz.
> > I can install the QNX software without any problem and the
> > system seems to run just fine.
> > However, when I try to decompress a .tar.F archive of about 200 MB,
> > I get an error at the point in the process where it asks me to
> > insert the next part of the archive.
> >
> > I verified that the archive was restorable on a different unit.
> 
> How did you test this?  Did you copy the archive from the failed
> machine
> to another machine?  Or did you download the archive on the other
> machine directly?
> 
> > Checking the transferred file with cksum, I discovered that the
> > cksum utility reports a different checksum each time it is run
> > on that large file.
> > The reported size of the file, however, matches exactly.
> >
> > If I run cksum on any other file, I seem to get the proper
> > result every time.
> 
> Are any of the other files as large as the .tar.F archive?
> 
> >
> > We first suspected the 500 GB SATA drive and the latest
> > Fsys.atapi driver, so we redid the test on the same unit
> > with a 10 GB EIDE drive running Fsys.eide.  The results were the same.
> >
> > We are not certain, but we suspect that the dual-core CPU
> > may be causing the issue; we are not sure what should be tried next.
> 
> QNX 4.x has no support for SMP, so it should only be running on one
> processor.
> Does the BIOS on your computer allow disabling the second core?  It
> should not make a difference, but it's also an easy thing to try.
> 
> >
> > Has anyone experienced a similar issue and found a solution?
> > Thanks
> >
> > _______________________________________________
> > General
> > http://community.qnx.com/sf/go/post23016
> >
> >
> 
> _______________________________________________
> General
> http://community.qnx.com/sf/go/post23018
> 
Re: RE: Large file giving out various cksum result on each run  
How I tested:

I transferred the file with FTP from the backup server.
I compared the file sizes, which match precisely (in byte count).

Indeed, a transfer can result in a corrupt file, which would then produce a wrong cksum value.  However, in this
specific case, not only is the value wrong, but cksum reports a different value each time the command is run on that
same file.
This seems to occur only on large files (I also checked with other large files I had on disk).

I did not attempt to determine at what file size this behaviour starts to occur.
The fact that the value is changing each time cannot be caused by the transfer alone.  The cksum utility should give
the same result after each run, whether that result is good or bad.
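
One way to narrow down where the instability lives is to checksum the file in fixed-size chunks on two passes and note which chunks disagree; scattered disagreements suggest something different from disagreements that cluster past a certain offset or at regular intervals. A rough sketch of that idea follows; the file name, chunk size and limits are placeholders, and plain byte sums are used rather than cksum's CRC.

/* Hypothetical sketch: per-chunk checksums over two passes, reporting
 * which chunk indices disagree between the passes. */
#include <stdio.h>
#include <stdlib.h>

#define CHUNK      65536L
#define MAX_CHUNKS 8192         /* handles files up to 512 MB at 64 KB chunks */

static long pass(const char *path, unsigned long *sums)
{
    FILE *fp = fopen(path, "rb");
    static unsigned char buf[CHUNK];
    long chunks = 0;
    size_t n, i;

    if (fp == NULL) {
        perror(path);
        exit(1);
    }
    while ((n = fread(buf, 1, sizeof buf, fp)) > 0 && chunks < MAX_CHUNKS) {
        unsigned long sum = 0;
        for (i = 0; i < n; i++)
            sum += buf[i];
        sums[chunks++] = sum;
    }
    fclose(fp);
    return chunks;
}

int main(int argc, char **argv)
{
    const char *path = (argc > 1) ? argv[1] : "bigfile.tar.F";  /* placeholder */
    static unsigned long first[MAX_CHUNKS], second[MAX_CHUNKS];
    long n1 = pass(path, first);
    long n2 = pass(path, second);
    long i, bad = 0;

    if (n1 != n2)
        printf("passes saw different lengths: %ld vs %ld chunks\n", n1, n2);
    for (i = 0; i < n1 && i < n2; i++) {
        if (first[i] != second[i]) {
            printf("chunk %ld (offset %ld) differs\n", i, i * CHUNK);
            bad++;
        }
    }
    printf("%ld of %ld chunks differ between the two passes\n", bad, n1);
    return bad != 0;
}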

As for trying to disable the dual-core functionality:
I didn't find any option in the CMOS setup that allows doing that.  I wish there were such an option.

I was hoping this might be a known issue in the QNX 4 environment.



RE: RE: Large file giving out various cksum result on each run  

> -----Original Message-----
> From: Remi Duquet [mailto:community-noreply@qnx.com]
> Sent: March-16-09 12:38 PM
> To: general-filesystems
> Subject: Re: RE: Large file giving out various cksum result on each run
> 
> How I tested:
> 
> I transferred the file with FTP from the backup server.
> I compared the file sizes, which match precisely (in byte count).
> 
> Indeed, a transfer can result in a corrupt file, which would then
> produce a wrong cksum value.  However, in this specific case, not only
> is the value wrong, but cksum reports a different value each time the
> command is run on that same file.
> This seems to occur only on large files (I also checked with other
> large files I had on disk).
> 

Most probably a hardware issue.  Did you run the test on more than one computer?
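
If hardware is the suspect, besides trying another machine, a crude user-space RAM check is another cheap test: fill a large buffer with an address-dependent pattern, read it back and verify, and repeat with the complement. A failing run would be telling; a clean run proves little, since it only touches whatever memory the allocator hands out. The buffer size and round count below are placeholders.

/* Hypothetical sketch: a crude user-space RAM check. */
#include <stdio.h>
#include <stdlib.h>

#define BUF_BYTES (32L * 1024L * 1024L)   /* 32 MB, placeholder */
#define ROUNDS    4

int main(void)
{
    unsigned long *buf;
    unsigned long words = BUF_BYTES / sizeof(unsigned long);
    unsigned long i, errors = 0;
    int round;

    buf = (unsigned long *) malloc(BUF_BYTES);
    if (buf == NULL) {
        fprintf(stderr, "could not allocate %ld bytes\n", BUF_BYTES);
        return 1;
    }
    for (round = 0; round < ROUNDS; round++) {
        unsigned long mask = (round & 1) ? ~0UL : 0UL;

        for (i = 0; i < words; i++)
            buf[i] = (i * 2654435761UL) ^ mask;   /* address-dependent pattern */
        for (i = 0; i < words; i++) {
            if (buf[i] != ((i * 2654435761UL) ^ mask)) {
                errors++;
                printf("mismatch at word %lu in round %d\n", i, round);
            }
        }
    }
    printf("%lu mismatches after %d rounds\n", errors, ROUNDS);
    free(buf);
    return errors != 0;
}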

> I did not attempt to determine at what file size this behaviour
> starts to occur.
> The fact that the value is changing each time cannot be caused by
> the transfer alone.  The cksum utility should give the same result
> after each run, whether that result is good or bad.
> 
> As for trying to disable the dual-core functionality:
> I didn't find any option in the CMOS setup that allows doing that.  I
> wish there were such an option.
> 
> I was hoping this might be a known issue in the QNX 4
> environment.
> 
> 
> 
> 
> 
> _______________________________________________
> General
> http://community.qnx.com/sf/go/post24463
>