What determines authoritative information?

On Sunday I needed to visit a particular store in New York, on a friend's referral. I knew they had two locations. Like all tech-savvy people, I googled "sports authority new york". I even visited the website from the link of the top result (with map). I typed the address from the map into my iPhone; why somebody hasn't invented a point-and-click way to transfer it I don't know (it probably does exist, but finding it and knowing about it is a far more complex problem). I even clicked on the map and zoomed in to find the nearest subway stop.

I got on the NY subway and headed into Manhattan, only to find myself scratching my head when I could not find the intended store. In fact, no store had ever existed at 57 W 57th St. Even the phone number Google listed went unanswered. Fortunately, the trusty iPhone with Internet access and the store locator on the official Sports Authority website enabled me to find the closest store, over a mile away.

I have often joked about our reliance on online information and the assumption that even trusted sites are accurate. I've used this example before: I know 1 mile is approximately 1.6 kilometers, and if you type "convert 1 mile to kilometer" into Google it answers "1 mile = 1.609344 kilometers". What if that were wrong, and it was actually, say, 1.659344?
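
That particular figure, at least, can be sanity-checked without trusting any single site, because the mile has an authoritative definition to fall back on: by international agreement, 1 mile is exactly 5,280 feet and 1 foot is exactly 0.3048 metres. A minimal sketch in Python, deriving the value from those definitions rather than from a search result:

```python
# Verify Google's "1 mile = 1.609344 kilometers" from first principles.
# These constants are exact by international definition (1959 agreement):
FEET_PER_MILE = 5280        # exact, by definition of the statute mile
METRES_PER_FOOT = 0.3048    # exact, by definition of the international foot

metres_per_mile = FEET_PER_MILE * METRES_PER_FOOT   # 1609.344 m
km_per_mile = metres_per_mile / 1000                # 1.609344 km

# Compare the derived value against the figure the search engine reports.
claimed = 1.609344
assert abs(km_per_mile - claimed) < 1e-9, "search result disagrees with the definition"
print(f"1 mile = {km_per_mile} km")                 # 1 mile = 1.609344 km
```

Of course, this only works because a unit of measure has an independent definition to check against. A store address has no such anchor beyond the retailer's own site, which is exactly the problem.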

Where are the safeguards for verifying information? Could verification even be possible? Readily available information does not equate to good information; indeed, searching for something today can return too much information, and not exactly what you are seeking.
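
One pragmatic safeguard is never to trust a single source for a fact that matters: fetch the same fact from two independent sources and flag any disagreement for a human to resolve. The sketch below illustrates the idea using my store hunt; both lookup functions are hypothetical stand-ins (one for the top search result, one for the retailer's official store locator), not real APIs, and the second address is an invented example.

```python
# A minimal cross-checking sketch: agreement between independent sources
# raises confidence; disagreement means the "fact" needs human verification.
# Both lookup functions are hypothetical placeholders, hard-coded with
# illustrative answers rather than wired to real services.

def lookup_search_engine(query: str) -> str:
    """Stand-in for the address shown by the top search result."""
    return "57 W 57th St, New York, NY"     # the address that did not exist

def lookup_official_site(query: str) -> str:
    """Stand-in for the address from the retailer's own store locator."""
    return "845 3rd Ave, New York, NY"      # hypothetical example address

def cross_check(query: str) -> str | None:
    answers = {lookup_search_engine(query), lookup_official_site(query)}
    if len(answers) == 1:
        return answers.pop()    # sources agree: reasonably trustworthy
    return None                 # sources disagree: verify before travelling

address = cross_check("sports authority new york")
if address is None:
    print("Sources disagree -- phone ahead before getting on the subway.")
else:
    print(f"Corroborated address: {address}")
```

Exact string comparison is the naive version; real addresses would need normalisation before comparing. But the principle, corroborate before you trust, is the point of the sketch.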

One wonders!

Tagged with: Google, Web
