FCC wants to make Do Not Call list permanent

Conformity Magazine pointed me to this December 4th notice from the FCC. The commission would like to eliminate the automatic expiration for the Do Not Call registry. I think every US citizen who is not employed in the telemarketing industry would like this rule change to go through.

The statement of Chairman Kevin J. Martin sums up the commission’s decision:

Today’s action tentatively concludes that telephone numbers registered in the National Do-Not-Call Registry will not expire after 5 years. The Commission continues to move forward to protect consumers who have registered their telephone numbers on the Do-Not-Call list. Consumers expect their telephone numbers to remain protected under the Do-Not-Call list until they have cancelled their registration or their telephone number is disconnected or reassigned.

Instant Media is dead, long live Miro

My regular reader 😉 may remember that I’ve recommended Instant Media (I’M) as an alternative to Joost in previous posts. Over the last 6 months or so I basically only used it for automatically downloading DL.TV and Cranky Geeks (linked under Netcasts on the right). I had noticed back in September that the I’M guide wasn’t working, but I hadn’t bothered to find out why. This week I looked around and found some information in these blog posts.

Instant Media Gone Bust? Feeling the Web Video Bubble Burst

Instant Media, Miro Competitor, Leaves The Net Without A Trace

Instant Media Grinds to a Halt

Those posts speculate on what happened, and the last one linked above has a fairly definitive answer from one of the former developers.

Scott Blum, the eccentric billionaire that was funding our company, decided to scuttle it mid-July

This was a little puzzling: why did the I’M web site stay online until September when the plug had been pulled in July? This blog post gives me an idea why: the company sued Microsoft, seeking a preliminary injunction over Microsoft’s use of their trademark, I’M. That seems like the reason to me. I’M had to stay up on the web until the court decided, and once the court ruled against I’M in the middle of August there was no further incentive to keep the site around.

Since I’M was gone I decided to look around for an alternative, and I found a great one, Miro. This is an excellent program, especially since it’s open source and cross platform. I installed it on my OpenSuse 10.3 PC and set up a Samba share so that my Media Center PC can play the videos. One feature I hope to make good use of once NerdTV starts season two is Miro’s support for BitTorrent. I like the idea of being able to easily give some of my bandwidth to NerdTV to help defray the distribution costs (PBS doesn’t have very deep pockets).
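
For anyone curious about the Samba piece, the share definition only takes a few lines. This is just a sketch with an example share name and path, not a copy of my actual setup:

# Example smb.conf share pointing at the directory where Miro saves its videos.
# Adjust the path to match your own machine.
[miro-videos]
    path = /home/username/Videos/Miro
    read only = yes
    guest ok = yes
    browseable = yes

After restarting the Samba service, the Media Center PC can browse to the share like any other network folder.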

Processing my digital photos part 1

Over the past month I’ve been revising the workflow for handling my digital photos. With the purchase of my first digital camera back in 2000 I soon realized the need to develop a methodology that kept my photos safe while being easy to find and back up. Keeping the flow simple was important to ensure that I would keep using it for years to come without needing major revisions.

Although I’ve been trying for forty years now, I have never become a really good photographer. The advent of digital photography and the scanning of old film and slides to digital formats has been a lifesaver. I can, and usually need to, retouch my photos without spending hours per image in the darkroom. With today’s easy retouch capability it is all too tempting to simply fix the original photos and save over them. This is tempting for its simplicity, but my experience has shown that with any original data I regret this choice later on. Once you have overwritten original data you can’t go back, so for all types of digital data I enforce a policy on myself of modifying only copies, never the originals.

The next consideration is the lossy compression used in many digital image file formats; every time a JPEG is edited and re-saved, a little more image quality is thrown away. It is not uncommon for me to go through many retouch iterations before I am satisfied with the result. Being all too familiar with the way PCs tend to crash at the worst possible time, I like to save my work frequently while working on images. To prevent the loss of image quality I prefer to use a lossless file format while editing and then export to JPEG after I’m finished.

These considerations led me to set up the first part of my workflow back in 2000. When I add images to my collection, I start by creating a subdirectory under my main image directory. This directory is named with the original date of the images using a year-month-day format of YYYY-MM-DD. I then create two subdirectories below the dated directory, one named Originals and the other named JPEG. The directory structure looks like this:

D:\My Pictures
              2007-12-25
                          JPEG
                          Originals
              2008-01-08
                          JPEG
                          Originals

Now I copy the images to the Originals directory and set the files to read-only using normal file management tools. With the read-only attribute set, most programs will automatically refuse to overwrite the original image. The few programs I regularly use that will overwrite read-only files at least give a warning message when I attempt to overwrite the file.
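
For example, this AutoHotkey one-liner marks everything in a dated Originals directory read-only (the path is just an illustration following the layout above; Windows Explorer or attrib +R from a command prompt does the same job):

; Sketch: set the read-only attribute on every file in an Originals directory.
; The path is only an example matching the directory layout shown above.
FileSetAttrib, +R, D:\My Pictures\2008-01-08\Originals\*.*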

That’s just about all I want to write up for part one of this series of posts. The final thing to cover is how I’ve automated the directory creation process. It only took a few weeks of using this procedure back in 2000 before I tired of manually creating directories. First I created a simple batch file, but it was a messy solution, so I whipped up a simple VB application. The app prompted for the date name, with the current date preset but editable; clicking OK created the date-named directory and the two subdirectories underneath it. I used that app for many years until I started using the open source, GPL-licensed AutoHotkey scripting language.

Here’s my current directory creating script:

; Create dated directory structure for images
; By Paul Hutchinson released to the public domain 2008
; Revised 1/9/2008

#SingleInstance ignore
#NoTrayIcon

; Assign the full path to the root of your picture directory to RootDir.
; Be sure to include the trailing backslash!
; e.g. C:\Documents and Settings\username\My Documents\My Pictures\
RootDir = D:\My Pictures\

InputBox, DirectoryDate, Create new pictures directory, Enter the date to use for the directory name (YYYY-MM-DD), , 375, 125, , , , , %A_YYYY%-%A_MM%-%A_DD%

if ErrorLevel  ; user pressed Cancel
    ExitApp

; Create the new directory and subdirectories
FileCreateDir, %RootDir%%DirectoryDate%
FileCreateDir, %RootDir%%DirectoryDate%\Originals
FileCreateDir, %RootDir%%DirectoryDate%\JPEG

ExitApp

A couple of reasons why the USA is losing its leadership role in science and technology

Sigh,

Poll finds more Americans believe in devil than Darwin (Reuters, emphasis mine)

It is the latest survey to highlight America’s deep level of religiosity, a cultural trait that sets it apart from much of the developed world.

It also helps explain many of its political battles which Europeans find bewildering, such as efforts to have “Intelligent Design” theory — which holds life is too complex to have evolved by chance — taught in schools alongside evolution.

It sadly also points out the way journalists don’t get science. See the bolded text: this reporter doesn’t understand either ID or evolution. ID is non-science that says God did it, and both ID and evolutionary science agree that life is too complex to have evolved by chance alone. Repeat after me: natural selection is NOT a random process. A big difference is that evolutionary science says we don’t know all the details of how life evolved but we’ll keep working on the missing bits, while ID says that if something is difficult to figure out right now, just say the designer did it. Science makes useful predictions about the natural world; ID wants people to stop trying to understand the hard parts of the natural world and just say God did it.

NeuroLogica Blog » Intelligent Design Fight Brewing in Texas

Andy Grove comparing apples and oranges

This post at Pharyngula today is a reply to an interview with Andy Grove over at Newsweek. A commenter over at Pharyngula, Ashutosh, also pointed me to a good reply by Derek Lowe.

This is a clear case of an apples and oranges comparison by Dr. Grove. In addition to the points raised by PZ and Derek, I’d add the reliability question. It is one thing to design a semiconductor that can have severe problems without harming people, and quite another to design drugs that cure the sick without harming them. There is no reset button on a human to give us a clean retry after a new drug crashes a human life. Any engineer who thinks developing safe and effective drugs can be improved as easily as semiconductor processes is either lacking an understanding of biology and pharmacology or fooling themselves about what they know. Dr. Grove, you are an embarrassment to a rational thinking engineer like myself. Keep in mind Expertise is real and it matters, and it sure seems that chemical engineer Andy Grove has no expertise in pharmacology, biology, or medicine.

UPDATES: Oops, I needed to change Mr. to Dr. because he has a doctorate in chemical engineering.

Tyler raises some good points from a CS/Math perspective.

The Slashdot post has some interesting comments.

The Pharma Marketing Blog has a post with an industry insider perspective.

Most pertinent is the Variable Fragment blog’s post; the writer has worked in process development at Intel and now works in biotechnology. What more can you ask for? Here’s someone with expertise in both areas.

Finally, the award for silliest comment goes to TechNudge.net for this bit of ridiculous commentary.

Andy, maybe if researchers would stop looking for non-existent proof of serious man-made global warming they’d have a few minutes to get on with diseases. But that’s not where the funding is.

I mean really, this fool thinks there is but one type of research and researcher, so we can just re-task them from climate science to pharmacology. I guess he’d like to call on the Geek Squad for all his medical needs too.

Eye-Fi – WiFi for digital cameras

EETimes has put up a couple of articles on the Eye-Fi SD card in the past week; you can read them here and here.

This device is a combination 2 GB SD card and WiFi adapter that turns any camera into a WiFi enabled camera. The scenario mentioned in the articles, sharing photos at a family get-together, sounds like a great application. However, let’s think more deeply about it: the scenario presented mentions technical glitches preventing the sharing of the photos on the day. I think throwing more technology at the problem is just as likely to cause technical problems. This is especially true since you can’t change the card’s WiFi settings while it is inside a camera. Now, if the person hosting the party had a bunch of these cards pre-programmed and tested for their WiFi network, and then gave them to guests to use at the party, it would probably work great. But at $100 apiece, having a bunch of these things around for a party isn’t economical for most people.

I think a better plan would be to have a PC set up with a multi-format card reader installed, and then be willing to help the party attendees use the card reader to put their photos on the PC for sharing.

Add to this the possibility that a radio transmitter mounted within inches of a camera’s sensitive CCD could introduce noise into the images, and the card’s usefulness could be negated. In a few years, if camera makers design and test for compatibility with the Eye-Fi, this potential problem should go away.

Eye-Fi » Where to Buy

Seagate settles bogus class action

I heard about this news from Computerworld via Slashdot via Greg Laden. This is a case where the only winners will be the lawyers who filed the class action lawsuit. Basically, Seagate was sued because they use the industry-standard units of measurement that follow SI rules.

This is actually the third time the data storage industry has been hit with class action suits over units of measurement. The problems started when computer guys got past the 1000 mark for the various units of measurement in computers. The computer industry decided on their own to take the SI prefixes, re-define them, and use the re-defined prefixes for their units of measurement. While this did create confusion, it wasn’t too bad when only kilo was commonly used, because 1000 and 1024 differ by only 2.4%. Once we started to frequently use the re-defined mega and giga prefixes things became worse (mega is off by 4.9% and giga by 7.4%). The IEEE recognized this was going to get rapidly worse (tera is 10% off), so in the late 90’s they introduced IEEE 1541 to provide unambiguous prefixes for the binary units of measurement. Soon other standards bodies (IEC, CENELEC, CIPM, NIST, and SAE) endorsed the new binary units, and that should have been the end of the troubles.
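
If you want to see where those percentages come from, here’s a quick AutoHotkey sketch of the arithmetic (the numbers I quoted above come from the same calculation; it assumes AutoHotkey v1.1 or later for the Format function):

; Compare each binary prefix (1024^n) to its decimal SI counterpart (1000^n)
; and show how far apart they are, in percent.
prefixes := "kilo,mega,giga,tera"
result := ""
Loop, Parse, prefixes, `,
{
    diff := ((1024.0 ** A_Index) / (1000.0 ** A_Index) - 1) * 100
    result .= A_LoopField . ": " . Format("{:.1f}", diff) . "% difference`n"
}
MsgBox, %result%  ; kilo: 2.4, mega: 4.9, giga: 7.4, tera: 10.0

That giga-level gap is the “missing” space people thought they were being cheated out of on their new hard drives.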

The first class action suit was against flash media manufacturers:

On February 20, 2004, Willem Vroegh filed a lawsuit against Lexar Media, Dane-Elec Memory, Fuji Photo Film USA, Eastman Kodak Company, Kingston Technology Company, Inc., Memorex Products, Inc.; PNY Technologies Inc., SanDisk Corporation, Verbatim Corporation, and Viking InterWorks alleging that their descriptions of the capacity of their flash memory cards were false and misleading.

The manufacturers agreed to clarify the flash memory card capacity on the packaging and web sites. The consumers could apply for “a discount of ten percent off a future online purchase from Defendants’ Online Stores Flash Memory Device”.

The law firms Gutride Safier, LLP and Milberg Weiss received $2.4 million.

The next year Western Digital was hit and here’s some interesting information from the settlement agreement.

They paid $500,000 in fees and expenses to San Francisco lawyers Adam Gutride and Seth Safier, who filed the suit.

The lawyers get the lion’s share of the cash; what’s new 😦

Surely Western Digital cannot be blamed for how software companies use the term “gigabyte” a binary usage which, according to Plaintiff’s complaint, ignores both the historical meaning of the term and the teachings of the industry standards bodies. In describing its HDD’s, Western Digital uses the term properly. Western Digital cannot be expected to reform the software industry.[2] Furthermore, there is no conceivable reason that consumers would perceive the size of Western Digital’s HDD’s as different in any respect from the size of other HDD’s on the market. All major HDD manufacturers offer HDD’s in the same, industry standard, decimal-defined storage capacities (e.g., 80 GB, 120 GB, 250 GB). Thus, a consumer buying an HDD is comparing decimal gigabytes from one HDD company to decimal gigabytes from another HDD company regardless of what software companies may be doing. Accordingly, price and reliability, not storage capacity nomenclature, are determinative of a customer’s decision to purchase from one HDD manufacturer over another. In short, Plaintiff’s claims are meritless.

I personally think someone should file suit against Microsoft for still using the re-defined unit prefixes in their OSes. I guess the lawyers who put these suits together realize that they’d have a really hard time going after the big $$$ company that is actually abusing the standards, so they go after the smaller guys who are actually following the SI standard.

Notwithstanding the fecklessness of Plaintiff’s claims, Western Digital recognizes not only the inherent risks and uncertainties of litigation, but also that the litigation would be an undesirable distraction and would require significant investment in employee time, attorneys’ fees, and costs. Taking into account these considerations and Western Digital’s desire to put this matter to rest, Western Digital believes that settlement on the terms set forth in the Settlement Agreement is in its best interests.

Par for the course in a US civil case, it usually is far cheaper to settle than to go to trial and prove you are right. 😦

[2] Apparently, Plaintiff believes that he could sue an egg company for fraud for labeling a carton of 12 eggs a “dozen”, because some bakers would view a “dozen” as including 13 items.

I love this footnote they stuck in the agreement. Watch out, egg producing industries, they can get you next. 😉

As to this recent case, Seagate had to settle to limit how much cash they lose. From the Settlement Agreement:

Awards attorney’s fees of up to $1.8 million.

Yep, the lawyers are the big winners again. 😦 I wish that by simply following standards manufacturers could avoid these class action suits, but sadly that is not the case in the USA.

A real test of super speaker cables, maybe not

I read in Swift that the makers of one of the outrageously expensive sets of speaker wires were going to submit to a real test. There is no rocket science involved in determining if a person can hear a difference between audio products. The ABX Double Blind Comparator System isn’t exactly new technology, and when used in a properly controlled test it yields excellent results. The problem is that most manufacturers don’t seem to want to do good tests; instead they depend on reviewers and not necessarily applicable technical measurements.
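
For readers who have never seen one, the logic inside an ABX test is almost trivial. Here’s a rough AutoHotkey sketch of just the bookkeeping (a real comparator switches the actual audio hardware, which this obviously doesn’t):

; Sketch of the ABX bookkeeping: X is secretly A or B on each trial,
; the listener guesses, and we count how often they are right.
trials := 16
correct := 0
Loop, %trials%
{
    Random, pick, 0, 1  ; secretly choose A (0) or B (1) as X
    secret := pick ? "B" : "A"
    InputBox, guess, ABX trial %A_Index% of %trials%, Is X cable A or cable B?
    if ErrorLevel  ; listener pressed Cancel
        break
    if (guess = secret)  ; "=" compares strings case-insensitively
        correct++
}
MsgBox, Correct on %correct% of %trials% trials.`nGuessing alone averages about half.

If someone really can hear the difference they should be right nearly every time; a score hovering around 8 out of 16 is just coin flipping.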

The next week I read about more developments in the process, and it was looking like the people making the claim for the big money cable were backing out. However, the next section of Swift gave me hope this would go forward. Randi had done something I hadn’t seen before: he changed the wording of his challenge rules to address the complaints of the reviewer, Michael Fremer.

For those readers who are unfamiliar with the JREF challenge, here are a few important points about it. People often make claims for things that have no plausible scientific explanation. The JREF has put up 1 million US dollars as a prize for any person who has made such a claim, has gotten the claim publicly known via the media, and can demonstrate the effect to the JREF. The claimant doesn’t have to explain how anything works; all they have to do is show that it works. Both the claimant and the JREF have to agree ahead of time on a test that demonstrates the claim. If the claimant passes the agreed-upon test, the JREF hands over the prize.

For the claims made by this audio reviewer this should be a very simple and straightforward test. The claim is that the reviewer can reliably tell the difference between the ultra-expensive Pear speaker wire and normally priced speaker wire. A simple controlled double blind listening test is all that is needed to decide the matter. So if the reviewer and manufacturer are truly sincere about their extraordinary claim, they will now go ahead and start discussing a simple test.

Sadly this post, BLAKE WITHDRAWS, has just gone up at the JREF. The manufacturer is pulling out before even hammering out a simple test procedure. This says to me that the manufacturer isn’t all that certain of their claim.

Some more reading about audio cables:
The Truth About Cables – AxiomAudio
Interconnect and speaker cable whitepaper
Speaker Cables from Blue Jeans Cable

From Audioholics Home Theater Reviews and News:
Un-Sound Advice About Cables
Top Ten Signs an Audio Cable Vendor is Selling You Snake Oil
AudioQuest Responds to Top 10 Snake Oil Article
Thiel Audio Interview on Cables
Cable Distortion and Dielectric Biasing Debunked
Skin Effect Relevance in Speaker Cables
Speaker Cable Face Off 1
Speaker Cable Reviews – Faceoff 2
Speaker Cable Faceoff 3

Why we need net neutrality

Two posts at the Electronic Frontier Foundation (EFF) appear to confirm what many have suspected.

EFF tests agree with AP: Comcast is forging packets to interfere with user traffic

Comcast keeps telling its users that the problems they’re seeing are not its fault. It’s time for Comcast to come clean about what it’s doing and take its users’ reports seriously.

Comcast is also Jamming Gnutella (and Lotus Notes?)

When an ISP starts arbitrarily zapping some of the protocols that its customers use, they instantly endanger the cascade of innovation that the Internet has enabled. Before this kind of traffic jamming, anybody — huge businesses, small start-ups, college students and children in their bedrooms — could build new, innovative protocols on top of the Internet’s TCP/IP platform.

If this type of conduct is allowed to continue, many innovators will have to get active assistance from an ISP in order to have their protocols allowed through the ISP’s web of spoofing and forgery. Technologies like BitTorrent and Joost, which are used to distribute licensed movies and are in direct competition with Comcast’s cable TV services, will be at Comcast’s mercy.

It should also be remembered that in many parts of the United States, Comcast is a duopoly or even a monopoly provider of broadband Internet access. Competition might offer some protection against packet-forging ISPs, but under current market conditions, we can’t depend on it.

The last paragraph is the big problem here: with most citizens having little or no choice of ISP, I think we need network neutrality in the United States. If we ever get to a point in the US where most citizens have three or more choices of provider, then it won’t matter if one or even two ISPs are interfering with their customers’ usage.

Cell Phone Jammer Foolishness

Last week’s edition of The McLaughlin Group had the stupidest debate ever.

Chatter Zapper.

Is it a new wave of technology, or is it an anti-wave? They’re called cell phone jammers, capable of voiding any conversation within 20 feet. This combative technology has been called “revenge tech” or “design noir” or “annoyance tech.”

We’ve all been there. You’re sitting on a sold-out train, a crowded bus. It starts with a cell phone ring, some zany, cacophonous sound. Then the person sitting next to you picks up her cell phone. The agony begins; first the retelling of her day, then it is a round of “He said, she said,” then what’s for dinner.

Unobtrusively you reach over and take out your “revenge tech” device — zap. That takes care of that.

I think it’s a pseudo-problem. It’s a pseudo- problem, because technology will now devise a jam-proof telephone or the chatter will not work.
-John McLaughlin

The reason I call this the stupidest debate ever on the show is that jammers are illegal, period. They have been illegal my whole lifetime, and they will remain illegal as long as humans want to have usable radio technology. This is not a new technology; as long as there has been radio, there has been radio jamming technology. They are confusing a new product with a new technology and ignoring the reality of the FCC rules.

In case you don’t think jammers will get you into deep trouble, here’s the FCC penalty.

Fines for a first offense can range as high as $11,000 for each violation or imprisonment for up to one year, and the device used may also be seized and forfeited to the U.S. government.

For more information and the rules for other countries see this Wikipedia article.

**** UPDATE ****

More information at this newer post