Heroes

I’m not one for hero worship. I admire some people greatly, but not to the point of ignoring their faults or putting them on a pedestal.

One individual who at least has a leg up on that pedestal is Sir Ernest Shackleton, the polar explorer.

I first came across Shackleton’s name reading Roland Huntford’s controversial and iconoclastic account of the expeditions to reach the South Pole, Scott and Amundsen, also published as The Last Place on Earth. In his book, Huntford exploded the myth of the heroic Scott losing the race to a somehow less-than-heroic Amundsen. It’s a book well worth reading, and was made into a mini-series for PBS.

Huntford also wrote a definitive biography of Shackleton. This book is another that repays reading.

Shackleton was a man of his time, with the same prejudices as his contemporaries. These can be a real thumb in the eye when one comes across them. On the flip side, he was a man who seemed to have been born a hundred years too late — and sometimes felt that way himself. He was a useless git as a husband, an employee, a friend. The only times in his life he was happy were when he was off in the polar regions, exploring. It’s a cliché, but his life is one the cliché was made from.

Shackleton was the only polar explorer who always brought his team back alive. He’s noted for two feats. He came within 100 miles of the South Pole. And he made what is widely regarded as the greatest open-boat journey ever made. After his ship, the Endurance, was trapped and crushed in the Antarctic sea ice, Shackleton and his 27-man crew were stranded on the floe ice; when the ice broke up, they rowed in lifeboats to Elephant Island. Without any other hope of rescue, Shackleton and five other men then sailed a 22.5-foot lifeboat from Elephant Island to South Georgia Island, 800 nautical miles (1,500 km; 920 mi), over a stormy winter ocean with seas up to 60 feet. Landing on the southwest side of the island, they hiked over its mountain range (at that time, unmapped) to reach a whaling station on the north side. From the whaling station, rescue missions were arranged, and all the men on Elephant Island were rescued.

The South Georgia boat party could expect to meet hurricane force winds and waves — the notorious Cape Horn Rollers — measuring from trough to crest as much as 60 feet (18 m). Shackleton therefore selected the heaviest and strongest of the three boats, … It had been built as a whaleboat in London to Worsley’s orders, designed on the “double-ended” principle pioneered by Norwegian shipbuilder Colin Archer. Shackleton asked the expedition’s carpenter, Harry McNish, if he could make the vessel more seaworthy. Using improvised tools and materials, McNish raised the boat’s sides and built a makeshift deck of wood and canvas, sealing his work with oil paints, lamp wick, and seal blood. The craft was strengthened by having the mast of the Dudley Docker lashed inside, along the length of her keel. She was then fitted as a ketch, with a mainmast and a mizzenmast, rigged to carry lugsails and a jib. The weight of the boat was increased by the addition of approximately 1 long ton (1,016 kg) of ballast, to lessen the risk of capsizing in the high seas that Shackleton knew they would encounter.1

He was a phenomenal leader whose expeditions “failed” in the traditional sense. Aside from being “first” to get within 100 miles of the Pole, he was never the one to set a definitive record. And yet he was heroic in his determination to protect the lives and health of the men serving under him. He dedicated himself to a bizarre, dangerous, and ultimately futile “career” as an explorer. He’s a footnote in history now, but a man we could well look to as an example of real leadership.

References

  • Huntford, Roland. Shackleton. New York: Carroll & Graf. 1985.
  • Huntford, Roland. The Last Place on Earth: Scott and Amundsen and the Race to the South Pole. New York: Modern Library Exploration. 1999. ISBN 978-0-375-75474-6.
  • Worsley, F. A. Shackleton’s Boat Journey. London: Pimlico. 1940, repr. 1999. ISBN 0-7126-6574-9.
  • Alexander, Caroline. The Endurance: Shackleton’s Legendary Antarctic Expedition. London: Bloomsbury Publishing. 1999. ISBN 0-7475-4670-3.

Resources

Chasing Shackleton. 3-part PBS documentary, in which an explorer attempts to recreate Shackleton’s boat journey. January 2014.

Shackleton. Biopic about Shackleton’s life, with Kenneth Branagh in the title role. 2002.

The Last Place on Earth. Mini-series about the Scott and Amundsen expeditions. Originally filmed in 1985, released on DVD in 2011.

I Want Me Some French

I don’t have a jones for the French, but as a people and as a country, they’re some tough hommes et femmes. The French have much to answer for with respect to Algeria, Mali, Rwanda, and other parts of central and west Africa where they were ruthless colonialists.

But they’re on the right side of the moral line now. This week, French commandos were in Mali, helping Malian forces retake a hotel, with 170 people inside, from Islamist militants. Not too long ago, when the same groups of militants seized the entire northern part of the country, the French went in, routed them, and handed that territory back to the legitimate government.

The administration of Bill Clinton unambiguously shares responsibility for the genocide in Rwanda in 1994. Not only did the President and members of his Cabinet know about the genocide, they actively blocked attempts by the UN Security Council to intervene and stop it. Between 50% and 70% of the Tutsi population of Rwanda were hacked, shot, and burned to death, over a 3-month period, with full knowledge of the American government.

The French, whose role in that period remains controversial (Rwanda is a former Belgian colony, but it sat squarely in the French sphere of influence; the French government publicly backed the Hutu-led government and armed its military forces), nonetheless were the only Western nation to actively intervene to stop the genocide. The only one.

The French told George Bush to kiss their derrières when he wanted their help invading Iraq, and were roundly condemned by many denizens of Gutlesswankistan. Their position in that instance (again) turned out to be the morally correct one. And now, even after the attacks in January and last week, the French remain committed to taking in 30,000 refugees — three times the number being accepted by the Land of Weak-in-the-Knees. Again, the morally correct choice.

Neither the French nor the Americans have more than a foot on the moral high ground. We’re all beneficiaries of some nasty and immoral actions, both by our ancestors and by our present-day governments.

But the French can claim one whole leg up on Americans, in refusing to be cowed by the terrorists at home. They aren’t running for the bomb shelters, turning away women and children at the borders, out of sweating fear. And, they can claim another leg up on Americans, abroad. French soldiers are on the ground, putting their lives at risk, to help other countries in the fight against terrorists.

Those who clamor against allowing any Syrians — men, boys, girls, women — into the country should walk it back and take a look at the French. Right now, your “freedom fries” are looking downright limp. I’ll take mine French.

The Abandonment of the Jews — and All Other Refugees

Wyman, David S. Introduction by Elie Wiesel. The Abandonment of the Jews: America and the Holocaust, 1941-1945. New York: The New Press. 1998. ISBN 1-56584-415-7.

This passage is excerpted from Dr. Wyman’s Preface to his book.

In summary, then, these are the findings I find most significant:

  1. The American State Department and the British Foreign Office had no intention of rescuing large numbers of European Jews. … their policies aimed at obstructing rescue possibilities and dampening public pressures for government action.

  2. Authenticated information that the Nazis were systematically exterminating European Jewry was made public in the United States in November 1942. President Roosevelt did nothing about the mass murder for fourteen months, then moved only because he was confronted with political pressures he could not avoid and because his administration stood on the brink of a nasty scandal over its rescue policies.

  3. The War Refugee Board, which the President then established to save Jews and other victims of the Nazis, received little power, almost no cooperation from Roosevelt or his administration, and grossly inadequate funding. (Contributions from Jewish organizations, which were necessarily limited, covered 90 percent of the WRB’s costs.) WRB managed to help save approximately 200,000 Jews and 20,000 non-Jews.

  4. Because of State Department administrative policies, only 21,000 refugees were allowed to enter the United States during the three and one-half years the nation was at war with Germany. That amounted to 10 percent of the number who could have been legally admitted under the immigration quotas during that period.

  5. Strong popular pressure for action would have brought a much fuller government commitment to rescue and would have produced it sooner.

  6. American Jewish leaders worked to publicize the European Jewish situation and pressed for government rescue steps.

  7. In 1944, the United States War Department rejected several appeals to bomb the Auschwitz gas chambers and railroads leading to Auschwitz, claiming that such actions would divert essential air-power from decisive operations elsewhere. Yet, in the very months that it was turning down the pleas, numerous massive American bombing raids were taking place within fifty miles of Auschwitz. Twice during that time, large fleets of American heavy bombers struck industrial targets in the Auschwitz complex itself, not five miles from the gas chambers.

  8. … The record also reveals that the reasons repeatedly invoked by government officials for not being able to rescue Jews could be put aside when it came to other Europeans who needed help.

  9. Franklin Roosevelt’s indifference to so momentous an historical event as the systematic annihilation of European Jewry emerges as the worst failure of his presidency.

  10. Poor though it was, the American rescue record was better than that of Great Britain, Russia, or the other Allied nations.

At the end of the preface, Dr. Wyman asks:

Would the reaction be different today? Would Americans be more sensitive, less self-centered, more willing to make sacrifices, less afraid of differences now than they were then?

I think we all know that the answer is, “No.” Even the people now advocating that we turn refugees away are open about where they stand. They don’t care.

Time Never Has a Stop

The National Institute of Standards and Technology (NIST) maintains the time standard for the United States. It operates a collection of atomic clocks and measuring instruments that together realize the country’s official version of Coordinated Universal Time, known internationally as UTC(NIST).

These instruments are located at a laboratory in Boulder, CO. In addition to the laboratory, NIST uses a nearby radio station, WWVB, to broadcast a time signal relayed to it from the laboratory. A radio-controlled clock (RCC) has a tiny receiver that picks up this signal and translates it into the current time according to the atomic clock; the firmware inside the clock mechanism then adjusts the displayed time to match.
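For the curious, here is a rough sketch, in Python, of what that translation amounts to. WWVB sends one symbol per second by briefly dropping its carrier power; the length of the drop encodes the symbol, and sixty symbols make one frame carrying the minute, hour, day of year, and year in BCD. This is a simplification of the real firmware — only the minute field is decoded, and all the signal-processing grunt work of measuring the drops is assumed away.

```python
# Sketch of the RCC's decode step. A carrier drop of ~0.2 s is a binary 0,
# ~0.5 s is a binary 1, and ~0.8 s is a position marker. Only the minute
# field (the first nine symbols of a frame) is handled here.

def classify(drop_seconds):
    """Map the measured length of one carrier drop to a WWVB symbol."""
    if drop_seconds < 0.35:
        return 0
    if drop_seconds < 0.65:
        return 1
    return "M"  # position marker

def decode_minute(symbols):
    """Symbol 0 is a frame marker; symbols 1-3 carry the 40/20/10 bits,
    symbol 4 is always 0, and symbols 5-8 carry the 8/4/2/1 bits."""
    weights = [0, 40, 20, 10, 0, 8, 4, 2, 1]
    return sum(w for w, s in zip(weights, symbols) if s == 1)

# Measured carrier-drop lengths (seconds) for a frame encoding minute 47:
drops = [0.8, 0.5, 0.2, 0.2, 0.2, 0.2, 0.5, 0.5, 0.5]
print(decode_minute([classify(d) for d in drops]))  # -> 47
```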

I have an RCC; I’ve had it for quite a few years. As is typical for these clocks, besides keeping accurate time, it also tracks indoor and outdoor temperature and relative humidity. Unfortunately, enough years have passed that my clock’s outdoor temperature sensor has bit it. I just ordered a new, improved model of RCC. I don’t have a real need for a clock that is accurate to ±0.5 second per day; I just like the idea.

The model I have now synchronizes the time only once a day, at 0200. If the sync fails, it waits 24 hours for the next attempt, during which period it can lose or gain as much as 0.5 sec — OR MORE!! The clock could then be as much as a full second off. OR MORE!! The new model will retry every hour, from 0000 to 0600, until it either succeeds or gives up at 0600.
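The arithmetic behind that outrage, for anyone who wants to check my worry, sketched in Python using the ±0.5 s/day figure quoted above:

```python
DRIFT_PER_DAY = 0.5  # worst-case free-running error, in seconds per day

def worst_case_error(hours_since_last_good_sync):
    """Worst-case accumulated error, in seconds, while the clock free-runs."""
    return DRIFT_PER_DAY * hours_since_last_good_sync / 24.0

# Old model: one missed 0200 sync means 24 more hours of free-running
# on top of the 24 hours since the last good sync.
print(worst_case_error(48))   # -> 1.0 second off. OR MORE!!

# New model: even if the 0000 sync fails, another attempt comes an hour later.
print(worst_case_error(25))   # -> about 0.52 seconds, worst case
```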

Among other things, the RCC demonstrates how artificial our computation of “time” has become. We often associate UTC, or Greenwich Mean Time (GMT), with scientific and technical measurement. But UTC is actually a kludge — a weighted average of times kept by some 70 clock laboratories around the world. UTC exists to maintain a clock that stays in step with the Earth’s rotation, which requires that leap seconds periodically be shoved into its time cycle. Scientific time is maintained as a separate clock, International Atomic Time (TAI), that is not munged in this way.
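The whole relationship between the two clocks is nothing more than a table of whole-second offsets, with one new entry for each leap second. A minimal sketch, with the table abridged to its most recent entries (the full list is published by the IERS):

```python
# TAI - UTC offsets (whole seconds), keyed by the UTC date each took effect.
TAI_MINUS_UTC = [
    ("2009-01-01", 34),
    ("2012-07-01", 35),
    ("2015-07-01", 36),
]

def tai_minus_utc(utc_date):
    """TAI - UTC, in whole seconds, for a UTC date given as 'YYYY-MM-DD'."""
    offset = 33  # value in force before the first table entry above
    for effective, seconds in TAI_MINUS_UTC:
        if utc_date >= effective:
            offset = seconds
    return offset

print(tai_minus_utc("2015-11-20"))  # -> 36 seconds
```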

“The oscillator found inside an RCC is based on the mechanical vibrations of a quartz crystal, typically counting 32,768 vibrations of the crystal to mark one second.” 1

A mere 32,768 vibrations per second! Not nearly good enough!
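There is a reason for that oddly specific number, though: 32,768 is 2 to the 15th power, so a chain of fifteen divide-by-two stages turns the crystal’s buzz into exactly one tick per second. A two-line check:

```python
# Why 32,768? It's 2**15, so fifteen divide-by-two stages reduce the
# crystal's output to exactly one pulse per second.
freq, stages = 32768, 0
while freq > 1:
    freq //= 2
    stages += 1
print(stages)        # -> 15 divider stages
print(2 ** stages)   # -> 32768 crystal vibrations counted per tick
```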

“… the second is defined internationally as the duration of 9,192,631,770 energy transitions of a cesium atom.” 2

Bwahaha! Now you’re talking accuracy!

Marcus du Sautoy has done a 3-part documentary for BBC Four, Precision: The Measure of All Things; it’s quite good. Part one covers time and distance.

Oh, BTW, the international standard is maintained in France, by the Bureau International des Poids et Mesures. 3

References

Time-scales and the International Bureau of Weights and Measures, Elisa Felicitas Arias, Director, Time Department, International Bureau of Weights and Measures. ITU News, 2013

From now on, four PTB primary atomic clocks will contribute to UTC. Press release, 2010


  1.   How Accurate is a Radio Controlled Clock?, Michael A Lombardi, Time and Frequency Division of the National Institute of Standards and Technology, March 2010 
  2. Ibid. 
  3. And, did you know that Frenchmen first conceived of the idea of latitude and longitude, and measured them? 

Rant on Security Certificates

One leg of the internet security stool is the SSL certificate. The cert underpins the authenticated, encrypted connection that protects your data from prying hackers — well, supposedly, although there have been suggestions that the NSA can probably break current SSL. But, anyway, the NSA is probably not going to steal your identity.

SSL certificates cost money. Potentially, a lot of money. The reason is that each certificate is issued for one specific web server or property. When you create a cert, you specifically assign it to “www.upyours.com,” or “secure.keepyourhandsoffmystack.com” — and that cert is technically no good anywhere else. By technically, I mean that you can use it somewhere else, like “iamtoocheaptobuyacert.org,” but anyone who hits that site is going to get a certificate error. The error tells you that the name on the cert does not match the name of the site, and that you should not go there.

The idea behind this name matching scheme is to prevent attacks in which the attacker uses a dummy certificate to trick you into going to the wrong site. As a cautious user, you would never go to a site that has a name mismatch.
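For the cautious and the curious alike, here is roughly what the browser’s name check amounts to, sketched with Python’s standard ssl module. The host name is a placeholder; point it at a site whose certificate was issued for a different name and the handshake fails with a certificate error instead of quietly carrying on.

```python
# Roughly the check a browser performs before deciding whether to warn you:
# does the name on the certificate match the site you asked for?
import socket
import ssl

def peer_subject(hostname, port=443):
    ctx = ssl.create_default_context()   # verification and name checking on
    with socket.create_connection((hostname, port), timeout=10) as sock:
        # server_hostname is the name the certificate must have been issued for
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()["subject"]

try:
    print(peer_subject("www.example.com"))   # placeholder host
except (ssl.CertificateError, ssl.SSLError) as err:
    # The programmatic version of the browser's "do not go there" page.
    print("certificate error:", err)
```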

Of course, you aren’t a cautious user, you’re a determined one — determined to spend money at some company that’s too cheap to buy all the certs it actually needs. And so, you click past the certificate error and get on to that wallet action.

Another common certificate error you will encounter is the “no trusted authority” error. The second leg of the security stool is that certificates must be issued by a recognized “certificate authority” (CA), whose own certificate chains up to a trusted root authority. Verisign is one example of a company authorized to act as such an authority. The authority is supposed to act as guarantor that your secure connection is going to a legitimate site, and not to some hacky-sack phony.
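And clicking past those warnings? It’s the point-and-click equivalent of the sketch below (placeholder host name; please don’t ship this): both the name check and the authority check are simply switched off, so any certificate at all, self-signed or otherwise, gets accepted.

```python
# What "just click past the error" amounts to: turn off both legs of the check.
import socket
import ssl

ctx = ssl.create_default_context()
ctx.check_hostname = False        # stop comparing the cert name to the site name
ctx.verify_mode = ssl.CERT_NONE   # stop requiring a trusted issuing authority

host = "self-signed.example"      # placeholder for the offending site
with socket.create_connection((host, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print("connected; certificate accepted without any checks at all")
```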

When web sites are under development, many instances of the site will be in the hands of developers, and consequently many certs may be needed. But they’re not needed for public, internet-facing use — only for private use. So web development environments like Visual Studio provide simple tools for creating a “self-signed” certificate. This certificate lets developers stand up the secure environment their code needs, without paying for anything.
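What those tools produce is, roughly, the following — a sketch using the third-party cryptography package (a reasonably recent version, assumed installed) and a made-up development host name. The telltale is in the builder: the subject and the issuer are the same name, and the certificate is signed with its own private key, which is exactly why no browser will trust it on its own.

```python
# Sketch of what a "create self-signed certificate" tool does under the hood.
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "dev.myapp.example")])

cert = (
    x509.CertificateBuilder()
    .subject_name(name)              # who the certificate is for...
    .issuer_name(name)               # ...and who issued it: the very same party
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .sign(key, hashes.SHA256())      # signed with its own private key
)
print(cert.subject == cert.issuer)   # -> True: nobody else vouches for it
```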

The problem arises when development is complete and the new site is published — along with the “self-signed” certificate. For various reasons, at the top of which list is “save money,” companies decide not to swap in a legitimate certificate when the site goes live, and instead keep using the “self-signed,” illegitimate, free one.

Now you have two legs of the stool sawn partway through. To protect users against bogus certificates, a web site’s SSL certificate should be issued to the site’s name, and it should be issued by a trusted root authority. To save money, and possibly out of laziness, legitimate companies use SSL encryption but deliberately ignore the rules for putting that security in place. By doing so, they imbue users with the sense that certificate errors don’t matter, and habituate them to clicking past the errors without serious thought. And that matters, because the third leg of our security stool is the willingness of users to attend to, and take seriously, certificate errors.

It’s not the 99% of the time when the landing site is legitimate that matters; it’s the 1% when it’s not that will do the damage.

I really get irked by these certificate errors. Comcast is one such sinner — its subdomain activate.comcast.com uses a self-signed certificate. Another is my bank, Mutual Security Credit Union. It recently farmed out its online account management to a third party, but continues to use its mscu.net certificate even though the site is now at netteller.com. It’s BS, and I don’t see an end to it unless somebody really gets burned by one of these illegitimate certs. Isn’t that the way of it?

Books — Still With Us for the Long Haul

Funny: dead-tree books have become passé in the general consciousness, yet they’re quite useful for some things that ebooks likely never will be.

I first got onto the idea of ebooks as a way of compactly carrying my technical books with me on business trips. I thought the electronic format would lend itself to efficient searching, and to marking up text as well.

Sadly, ebooks are a gigantic fail as tools for research. This failure is the result of publishers’ whack-doodle fixation on controlling “content,” not of the medium itself. As anybody who reads ebooks knows, you can’t copy text out of the books. You can’t print text out of them, either. The search functions are so primitive they would be laughable, if they weren’t so revelatory of the laziness of the reader designers. E-readers all seem to have been designed by people who don’t read. Books, at any rate.

Almost all technical books that come in digital format give horribly inadequate access to their diagrams, charts, and related visual elements. Some implementations are downright monstrous. The publisher of a programming book full of code samples will have made no adjustments so that those samples display properly on a digital screen. As a result, they’re broken across multiple lines, sometimes of only two or three words each, and are nearly unreadable. In other cases, charts are rendered as images so tiny they really are unreadable.

But, real books sometimes don’t fare any better. Poetry is particularly susceptible to the hammering indifference of publishers to “minor” issues like line length.

I think it’s a dirty dog shame that a huge opportunity to open books up to whole new markets has been binned because of publishers’ obsession with “unauthorized use.” Whole forests have been felled to produce millions of tons of crap books that no one reads ten years after they hit the shelves, but which will be with us for a century, or maybe two, longer — bulking up landfills everywhere.

It isn’t just nostalgia that keeps people with the paper versions. The technically lousy way in which digital books are implemented assures a future for paper books.

The IEEE Ranking of Programming Languages – July 2015

The IEEE ranks these programming languages by how much general use they are getting. Its methodology combines the activity of each language’s user base on social media, the number of open source projects using the language, the demand for programmers in the language and their salaries, and so forth. It’s a bit of a heat map, I guess: what’s hot — where the jobs are, where the money is, and where the enthusiasm is highest. Or lowest. It’s not meant to be a measure of value.
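For a sense of the flavor of that methodology, here is a toy version: a few popularity signals, normalized and combined with weights into a single score. The languages are real, but the signal values and the weights are invented for illustration; the actual ranking uses about a dozen data sources and its own weighting.

```python
# A toy of IEEE-style scoring: a weighted sum of normalized popularity signals.
SIGNALS = {   # invented values, already scaled to 0..1
    "Java": {"social": 0.9, "open_source": 0.8, "job_ads": 1.0},
    "Go":   {"social": 0.6, "open_source": 0.7, "job_ads": 0.4},
    "Perl": {"social": 0.3, "open_source": 0.4, "job_ads": 0.3},
}
WEIGHTS = {"social": 0.3, "open_source": 0.3, "job_ads": 0.4}

def score(lang):
    return sum(WEIGHTS[k] * v for k, v in SIGNALS[lang].items())

for lang in sorted(SIGNALS, key=score, reverse=True):
    print(lang, round(score(lang), 2))
```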

The big application workhorses are still smoking hot. Assembly is hotter than Perl! Haha! I’m interested to see Go high up the list, at 13. I was just reading up on it last week. Some friends are missing altogether — Dart, Groovy, Elixir. Maybe, next year. HTML — that’s so old skool! Not even top-20 anymore.

1. Java
2. C
3. C++
4. Python
5. C#
6. R
7. PHP
8. JavaScript
9. Ruby
10. Matlab
11. SQL
12. Shell
13. Go
14. Assembly
15. Perl
16. Swift
17. Visual Basic
18. Arduino
19. Scala
20. Objective-C
21. HTML
22. Processing
23. Cuda
24. Lua
25. D
26. SAS
27. Haskell
28. Delphi
29. Fortran
30. Lisp
31. VHDL
32. Ada
33. Rust
34. Clojure
35. LabView
36. Erlang
37. Verilog
38. Prolog
39. Ladder Logic
40. Julia
41. ABAP
42. Cobol
43. Scheme
44. TCL
45. Forth
46. J
47. Actionscript
48. Ocaml

Dead or Deadly — The Wanked-up Web World

I’m down to my last neuron with the “world wide web.” We’re at the mercy of developer-driven technology: “Ooh, look at the cool widget; let’s deploy that.”

Really, it’s unbelievable to me how little usability matters for modern enterprise web sites. I was just over at our utility company’s site to pay the bill … I can’t even go over all the things that are wrong with that site; I’ll become so enraged I’ll kill myself.

A huge (and I mean huge, a third of the window) animated HTML5 banner scrolls across the top of the page, and at the top, in tiny letters, maybe 14 pt type, sits the link for “My Account.” Hahaha! When I run the mouse over the banner, it triggers a pop-out that covers up the very link I’m going for! What! What! Who the — designed that piece of crap, and who thought it was a good idea to pop up promos that prevent people from getting to their account page?

God in Heaven, have mercy on those fools. If I ever were to meet them, I wouldn’t.

This particular example is just that — particular, and an example. I have a “business class” Comcast VoIP and network connection here in my home office. A run at an internet speed-test site just reported a download speed of 17.2 Mbps, and I just downloaded a 1 GB zip file in 10:18. That’s not super fast, but I’m not complaining about it. I’m at home, not on a corporate pipe, and I’m connecting through a VPN server in New Jersey, so there’s some overhead there.

So, given that I can download a 1 GB file in a little over 10 minutes, why am I waiting, and waiting, and waiting, for that web page to load? Yes, the modern web page is bloated, weighing in at maybe 2 to 5 MB of data. But why is the status line reading “waiting for cdn.somefuckedupsitesomewhere…”? Weren’t Content Delivery Networks supposed to speed up the page load?

Why can I not read a web page as it loads? Oh, welcome to the wanked-up world of asynchronous page loads. The genius idea with async loading is that you can have parts of the page ready and operational while other parts are still being pulled down from the server. I don’t criticize this idea … but … when the browser doesn’t know ahead of time how much page real estate to assign to the downloaded object, it resizes the page on the fly, once the object is available to be presented.

Hahaha! This means my displayed content is jumping all over the place as varying portions of the page are resized! I’m half-way through that first paragraph of the news article, when the gigantic video header for the page is popped into the display, pushing down all the content below it. Now, the paragraph I was reading is clear off the bottom of the screen! Oh, hit the spacebar, the page scrolls down, and the content reappears. Okay, now searching for the last line I was reading, and … the sidebar loads, so the browser shifts all the content to the left and restructures the paragraph line lengths! Good one! You almost had me — I almost was able to read the available content before the page had completely loaded. Almost.

I am led to the question — what is the point of the technology? Weren’t we on a mission to make the web better, more usable? The Wanked-up Web World started life as a tool for efficient information exchange across distance. Through no fault of its own, it turned into a “marketplace,” a primarily commercial enterprise in which the information is secondary to the presentation. It’s become a gigantic television commercial, from which there is no escape.

If you go back to look at some old web site pages from 10 years ago, they’re actually usable presentations of information. Yeah, they were mighty plain looking, not jammed up with images, CSS styling, and uber-cool fonts. Some people — okay, many people — hurled common sense down the toilet like Friday night’s beer, with ridiculous design decisions, like blinking text. But, yesteryear’s mechanisms for gobbing up a web site are quaint, compared to the sophisticated tools of user torment now deployed all across the netosphere. Not only has usability been thrown out with the party trash, the party itself was in celebration of having done away with usability and having deployed yet more cool widgets that have suicide hot lines lighting up across America and around the world.

Sometimes, the old ways really are better ways.