Identifying Bad Memory

I was recently having problems with a dedicated production server that runs my MySQL Server and a number of websites. It is most annoying when your system crashes without anything being reported in /var/log/messages.

The tool of choice from my host provider, SoftLayer, was PassMark BurnInTest Linux, which is installed with every dedicated server.

As this is a commercial product I will need to investigate open-source alternatives, but for the purposes of my pain this bundled tool was well worth it.
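Under the hood, a RAM test like this boils down to a write/verify loop: fill memory with known bit patterns, read every word back, and flag any mismatch, which is exactly the "Error verifying data in RAM" reported below. Here is a minimal sketch of that loop in C, an illustration only rather than PassMark's actual implementation; the 64 MB buffer size and the patterns are arbitrary choices:

/* A minimal sketch of the write/verify loop at the heart of a RAM
 * burn-in test: fill a buffer with known bit patterns, read each word
 * back, and count mismatches. Real testers add walking ones/zeros,
 * address-in-address patterns, and page locking. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

#define BUF_WORDS ((64 * 1024 * 1024) / sizeof(uint64_t))  /* 64 MB */

int main(void)
{
    /* volatile keeps the compiler from optimizing the verify pass away */
    volatile uint64_t *buf = malloc(BUF_WORDS * sizeof(uint64_t));
    if (buf == NULL) {
        perror("malloc");
        return 1;
    }

    const uint64_t patterns[] = {
        0x0000000000000000ULL, 0xFFFFFFFFFFFFFFFFULL,
        0xAAAAAAAAAAAAAAAAULL, 0x5555555555555555ULL
    };
    size_t errors = 0;

    for (size_t p = 0; p < sizeof(patterns) / sizeof(patterns[0]); p++) {
        for (size_t i = 0; i < BUF_WORDS; i++)     /* write pass  */
            buf[i] = patterns[p];
        for (size_t i = 0; i < BUF_WORDS; i++)     /* verify pass */
            if (buf[i] != patterns[p])
                errors++;
    }

    printf("%zu error(s) verifying data in RAM\n", errors);
    free((void *)buf);
    return errors ? 1 : 0;
}

Open-source testers such as memtester and memtest86+ implement far more rigorous variants of this loop, and are the obvious candidates for the alternatives mentioned above.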

**************
RESULT SUMMARY
**************
Test Start time: Sun Feb 22 16:02:48 2009
Test Stop time: Sun Feb 22 16:07:49 2009
Test Duration: 000h 05m 01s

Test Name           Cycles  Operations     Result  Errors  Last Error
CPU - Maths            261  488 Billion    PASS         0  No errors
Memory (RAM)             2  3.081 Billion  FAIL         1  Error verifying data in RAM
Network: 127.0.0.1  412995  4.295 Billion  PASS         0  No errors
TEST RUN FAILED

*********************
SERIOUS ERROR SUMMARY
*********************
SERIOUS : 2009-02-22 16:07:31, RAM, SERIOUS: Error verifying data in RAM (x 1)

It was great to get a simple resolution to the problem: bad memory.
After a scheduled maintenance window to replace it, I was operational again.

**************
RESULT SUMMARY
**************
Test Start time: Sun Feb 22 20:34:37 2009
Test Stop time: Sun Feb 22 20:39:38 2009
Test Duration: 000h 05m 01s

Test Name           Cycles  Operations     Result  Errors  Last Error
CPU - Maths            267  406 Billion    PASS         0  No errors
Memory (RAM)             1  3.664 Billion  PASS         0  No errors
Network: 127.0.0.1  334578  3.480 Billion  PASS         0  No errors
TEST RUN PASSED

*********************
SERIOUS ERROR SUMMARY
*********************