Portions of this post are taken from a Bloomberg View commentary carried by the publication IndustryWeek, 30 October 2017.

The Bloomberg report begins by stating: “The industrial conglomerate has lost $100 billion in market value this year as investors came to terms with the dawning reality that GE’s businesses don’t generate enough cash to support its rich dividend.”

Do you in your wildest dreams think that Jack Welch, former CEO of GE, would have produced results such as this?  I do NOT think so.  Welch “lived” with the guys on Wall Street.  These pitiful results come to us from Mr. Jeffrey Immelt.  It’s also now clear that years of streamlining didn’t go far enough, as challenges of dumpster-fire proportions at GE’s power and energy divisions overshadowed what were actually pretty good third-quarter health-care and aviation numbers.  Let me mention right now that I feel entitled to sound off about these results: I retired from a GE facility, the Roper Corporation, in 2005.

New CEO John Flannery’s pledge to divest twenty billion dollars ($20 billion) in assets perhaps risks another piecemeal breakup, but as details leak about the divestitures and other changes Flannery is contemplating, there’s at least a shot he could be positioning the company for something more drastic.  Now back to Immelt.

Immelt took over the top position at GE in 2001. Early attempts at reshaping the corporate culture to match Immelt’s ideas about what it should look like were not very successful. It was during the financial crisis that he began to think differently. It seems as if his thinking followed three paths. First, get rid of the financial areas of the company because they were just a diversion from what needed to be done. Second, make GE into a company focused on industrial goods. And third, create a company that would tie the industrial goods to information technology so that the physical and the informational would all be one package. The results of Immelt’s thinking are not impressive and did not position GE for growth in the twenty-first century.

Any potential downsizing by Flannery will please investors who have viewed the digital foray as an expensive pet project of Immelt’s, but it’s sort of a weird thing to do if you still want to turn GE into a top-ten software company — as is the divestiture of the digital-facing Centricity health-care IT operations that GE is reportedly contemplating.  Perhaps a wholesale breakup of General Electric Co. isn’t such an improbable idea after all.

One argument against a breakup of GE was that it would detract from the breadth of expertise and resources that set the company apart in the push to make industrial machinery of all kinds run more efficiently. But now, GE’s approach to digital appears to be changing. Rather than trying to be everything for everyone, the company is refocusing digital marketing efforts on customers in its core businesses and deepening partnerships with tech giants including Microsoft Corp and Apple Inc. It hasn’t announced any financial backers yet, but that’s a possibility former CEO Jeff Immelt intimated before he departed. GE’s digital spending is a likely target of its cost-cutting push.

The company is unlikely to abandon digital altogether. Industrial customers have been trained to expect data-enhanced efficiency, and GE has to offer that to be competitive. As Flannery said at GE’s Minds and Machines conference last week, “A company that just builds machines will not survive.” But if all we’re ultimately talking about here is smarter equipment, as opposed to a whole new software ecosystem, GE doesn’t necessarily need a health-care, aviation and power business.

Creating four or five mini-GEs would likely mean tax penalties.  That’s not in and of itself a reason to maintain a portfolio that’s not working; if it were, GE wouldn’t also be contemplating a sale of its transportation division. But one of GE’s flaws in the minds of investors right now is its financial complexity, and there’s something to be said for a complete rethinking of the way it’s put together. For what it’s worth, the average of JPMorgan Chase & Co. analyst Steve Tusa’s sum-of-the-parts analyses points to a twenty-dollar ($20) valuation — almost in line with GE’s closing price of $20.79 on Friday. Whatever premium the whole company once commanded over the value of its parts has been significantly weakened.

Wall Street is torn on General Electric, the one-time favorite blue chip for long-term investors, which is now facing an identity crisis and a possible dividend cut. Major research shops downgraded and upgraded the industrial company following its third-quarter earnings miss this past Friday. The company’s September-quarter profits were hit by restructuring costs and weak performance from its power and oil and gas businesses. It was the company’s first earnings report under CEO John Flannery, who replaced Jeff Immelt in August. Two firms reduced their ratings for General Electric shares due to concerns about a dividend cut being announced at its Nov. 13 analyst meeting. The company has a 4.2 percent dividend yield. General Electric shares declined 6.3 percent Monday to close at $22.32 a share after the reports. The percentage drop is the largest for the stock in six years. Its shares are down twenty-five percent (25%) year to date through Friday versus the S&P 500’s fifteen percent (15%) return.

At the end of the day, it comes down to what kind of company GE wants to be. The financial realities of a breakup might be painful, but so would years’ worth of pain in its power business as weak demand and pricing pressures drive a decline to a new normal of lower profitability. Does it really matter, then, what the growth opportunities are in aviation and health care? As head of M&A at GE, Flannery was at least partly responsible for the Alstom SA acquisition that swelled the size of the now-troubled power unit inside GE. If there really are “no sacred cows,” he has a chance to rewrite that legacy.

CONCLUSIONS:

Times are changing and GE had better change with those times or the company faces significant additional difficulties.  Direction must be left to the board of directors, but it’s very obvious that accommodations to suit the present business climate are definitely in order.


ASTROLABE

October 25, 2017


Information for the following post was taken from an article entitled “It’s Official: Earliest Known Marine Astrolabe Found in Shipwreck” by Laura Geggel, senior writer for LiveScience, 25 October 2017.

It’s amazing to me how much history is yet to be discovered, understood and transmitted to readers such as you and me.   I read a fascinating article some months ago indicating the history we do NOT know far exceeds the history we DO know.  Of course, the “winners” get to write their version of what happened.  This is as it has always been. In the great and grand scheme of things, we have artifacts and mentifacts.

ARTIFACT:

“Any object made by human beings, especially with a view to subsequent use.  A handmade object, as a tool, or the remains of one, as a shard of pottery, characteristic of an earlier time or cultural stage, especially such an object found at an archaeological excavation.”

MENTIFACT:

“Mentifact (sometimes called a “psychofact”) is a term coined by Sir Julian Sorell Huxley, used together with the related terms “sociofact” and “artifact” to describe how cultural traits, such as “beliefs, values, ideas,” take on a life of their own spanning over generations, and are conceivable as objects in themselves.”

The word astrolabe is defined as follows:

The astrolabe is a very ancient astronomical computer for solving problems relating to time and the position of the Sun and stars.  Several types of astrolabes have been made.  By far the most popular type is the planispheric astrolabe, on which the celestial sphere is projected onto the plane of the equator.  A typical old astrolabe was made of brass and was approximately six (6) inches in diameter, although much larger and smaller astrolabes were also fabricated.

The subject of this post is the device described below.

FIND:

More than 500 years ago, a fierce storm sank a ship carrying the earliest known marine astrolabe — a device that helped sailors navigate at sea, new research finds. Divers found the artifact in 2014, but were unsure exactly what it was at the time. Now, thanks to a 3D-imaging scanner, scientists were able to find etchings on the bronze disc that confirmed it was an astrolabe.

“It was fantastic to apply our 3D scanning technology to such an exciting project and help with the identification of such a rare and fascinating item,” Mark Williams, a professorial fellow at the Warwick Manufacturing Group at the University of Warwick, in the United Kingdom, said in a statement. Williams and his team did the scan.

 

The marine astrolabe likely dates to between 1495 and 1500, and was aboard a ship known as the Esmeralda, which sank in 1503. The Esmeralda was part of a fleet led by Portuguese explorer Vasco da Gama, the first known person to sail directly from Europe to India.

In 2014, an expedition led by Blue Water Recoveries excavated the Esmeralda shipwreck and recovered the astrolabe. But because researchers couldn’t discern any navigational markings on the nearly seven-inch-diameter (17.5 centimeters) disc, they were cautious about labeling it without further evidence.

Now, the new scan reveals etchings around the edge of the disc, each separated by five degrees, Williams found. This detail proves it’s an astrolabe, as these markings would have helped mariners measure the height of the sun above the horizon at noon — a strategy that helped them figure out their location while at sea, Williams said.  The disc is also engraved with the Portuguese coat of arms and the personal emblem of Dom Manuel I, Portugal’s king from 1495 to 1521.  “Usually we are working on engineering-related challenges, so to be able to take our expertise and transfer that to something totally different and so historically significant was a really interesting opportunity,” Williams said.
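
To make the navigation use concrete: reading the sun’s noon altitude off that five-degree scale let a mariner estimate latitude with simple arithmetic.  Below is a minimal sketch of that noon-sight calculation (my own illustration, not taken from the article), assuming the common case of a Northern Hemisphere observer with the noon sun to the south; the sun’s declination would have come from printed solar tables of the period.

```python
# Minimal sketch of the noon-sight latitude calculation (illustrative only).
# Assumptions: Northern Hemisphere observer, noon sun due south of the observer,
# and the sun's declination taken from period solar tables.

def latitude_from_noon_sight(sun_altitude_deg: float, sun_declination_deg: float) -> float:
    """Estimate latitude from the sun's altitude above the horizon at local noon."""
    zenith_distance = 90.0 - sun_altitude_deg      # angle between the sun and the point overhead
    return zenith_distance + sun_declination_deg   # observer's latitude in degrees north

# Example: an altitude of 50 degrees read from the astrolabe and a declination
# of +15 degrees give a latitude of roughly 55 degrees north.
print(latitude_from_noon_sight(50.0, 15.0))  # 55.0
```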

CONCLUSIONS:

The purpose of this device could only be confirmed through three-dimensional scanning techniques.  Once again, modern technology allows for the unveiling of the truth.  The engraved emblem of Portugal’s king nailed down the time period.  This is a significant find, and it adds physical evidence to what we know of the early voyages of discovery.

DISTRACTIONS

October 18, 2017


Is there anyone in the United States who does NOT use our road systems on a daily basis?  Only senior citizens in medical facilities and those unfortunate enough to have health problems stay off the roads.  I have a daily commute of approximately thirty-seven (37) miles, one way, and you would not believe what I see.  Then again, maybe you would.  You’ve been there, done that, got the T-shirt.

It’s no surprise to learn that in-vehicle information systems cause driver distraction, but recent news from the AAA Foundation for Traffic Safety indicated the problem may be worse than we thought. A study released by the organization showed that the majority of today’s infotainment technologies are complex, frustrating, and maybe even dangerous to use. Working with researchers from the University of Utah, AAA analyzed the systems in thirty (30) vehicles, rating them on how much visual and cognitive demand they placed on drivers. The conclusion: none of the thirty produced low demand. Twenty-three (23) of the systems generated “high” or “very high” demand.

“Removing eyes from the road for just two seconds doubles the risk for a crash,” AAA wrote in a press release. “With one in three adults using the systems available while driving, AAA cautions that using these technologies while behind the wheel can have dangerous consequences.”

In the study, University of Utah researchers examined visual (eyes-on-the-road) and cognitive (mental) demands of each system, and looked at the time required to complete tasks. Tasks included the use of voice commands and touch screens to make calls, send texts, tune the radio and program navigation. And the results were uniformly disappointing—really disappointing.

We are going to look at the twelve (12) vehicles categorized by researchers as having “very high demand” information systems. The vehicles vary from entry-level to luxury and from sedan to SUV, but they all share one common trait: AAA says the systems distract drivers.  This is, to me, very discouraging.  Here we go.

CONCLUSIONS:

I’m definitely NOT saying don’t buy these cars, but the distraction potential is worth knowing about and compensating for when driving.


Portions of the following post were taken from the September 2017 issue of Machine Design magazine.

We all like to keep up with salary levels within our chosen profession.  It’s a great indicator of where we stand relative to our peers and the industry we participate in.  The state of the engineering profession has always been relatively stable. Engineers are as essential to the job market as doctors are to medicine. Even in the face of automation and the fear many have of losing their jobs to robots, engineers are still in high demand.  I personally do not think most engineers will be displaced by robotic systems.  That fear more properly belongs to production-line manufacturing positions with duties that are repetitive in nature.  As long as engineers can think, they will have employment.

The Machine Design Annual Salary & Career Report collected information and opinions from more than two thousand (2,000) Machine Design readers. The employee outlook is very good, with thirty-three percent (33%) indicating they are staying with their current employer and thirty-six percent (36%) of employers focusing on job retention, up fifteen percent (15%) from 2016.  Among those who responded to the survey, the average reported salary for engineers across the country was $99,922; almost sixty percent (57.9%) reported a salary increase while fewer than ten percent (9.7%) reported a salary decrease. The three top-earning industries with the largest workforces were 1.) industrial control systems and equipment, 2.) research & development, and 3.) medical products. Among these industries, the average salary was $104,193. The West Coast looks like the best place for engineers to earn a living, with the average salary in California, Washington, and Oregon coming in at $116,684. Of course, the cost of living in these three states is definitely higher than in other regions of the country.

PROFILE OF THE ENGINEER IN THE USA TODAY:

As is the ongoing trend in engineering, the profession is dominated by male engineers, with seventy-one percent (71%) being over fifty (50) years of age. However, the MD report shows an upswing of young engineers entering the profession.  One effort that has been underway for some years now is encouraging more women to enter the profession.  With seventy-one percent (71%) of the engineering workforce being over fifty, there is a definite need to attract new participants.  There was an increase in engineers between the ages of twenty-five (25) and thirty-five (35), up from 5.6% to 9.2%.  The percentage of individuals entering the profession increased as well, with engineers having less than fourteen (14) years of experience increasing five percent (5%) from last year.  Even with all the challenges of engineering, ninety-two percent (92%) would still recommend the engineering profession to their children, grandchildren, and others. One engineer responded, “In fact, wherever I’ll go, I always will have an engineer’s point of view. Trying to understand how things work, and how to improve them.”

 

When asked about foreign labor forces, fifty-four percent (54%) believe H1-B visas hurt engineering employment opportunities, and sixty-one percent (61%) support measures to reform the system. In terms of outsourcing, fifty-two percent (52%) reported their companies outsource work, the main reason being a lack of in-house talent. However, seventy-three percent (73%) of the outsourced work goes to other U.S. locations. When discussing the future of the job force, fifty-five percent (55%) of engineers believe there is a job shortage, specifically in the skilled labor area. An overwhelming eighty-seven percent (87%) believe that we lack a skilled labor force. According to the MD readers, the strongest place for job growth is in automation, at forty-five percent (45%), and the strongest place to look for skilled laborers is in vocational schools, at thirty-two percent (32%). The future of engineering depends not only on the new engineers in school today, but also on younger people just starting to develop their science, technology, engineering, and mathematics (STEM) interests. With the average engineer being fifty (50) years old or older, the future of engineering will rely heavily on new engineers willing to carry the torch—eighty-seven percent (87%) of our engineers believe there needs to be more focus on STEM at an earlier age to make sure the future of engineering is secure.

With that being the case, let us now look at the numbers.

The engineering profession is a “graying” profession, as mentioned earlier.  The next graphic indicates that, for the most part, those in engineering have been in it for the “long haul”.  They are “lifers”.  This fact speaks volumes when trying to influence young men and women to consider the field of engineering.  If you look at “years in the profession”, “work location”, and “years at present employer”, we see the following:

The slide below is a surprise to me, and I think it is the first time the question has been asked by Machine Design: How much of your engineering training is theory vs. practice?  You can see the greatest response, at almost fourteen percent (13.6%), was a fifty/fifty balance between theory and practice.  In my opinion, this is as it should be.

“The theory can be learned in a school, but the practical applications need to be learned on the job. The academic world is out of touch with the current reality of practical applications since they do not work in that area.”

“My university required three internships prior to graduating. This allowed them to focus significantly on theoretical, fundamental knowledge and have the internships bolster the practical.”

ENGINEERING CERTIFICATIONS:

The demands made on engineers by their respective companies can sometimes be time-consuming.  The respondents indicated the following certifications that their companies felt were necessary.

 

 

SALARIES:

The lowest salary is found in contract design and manufacturing.  Even this salary would be much desired by just about any individual.

As we mentioned earlier, the West Coast provides the highest salaries, with several states in the New England area coming in a fairly close second.

 

SALARY LEVELS VS. EXPERIENCE:

This one should be no surprise.  The greater the number of years in the profession, the greater the salary level.  Forty (40) plus years provides an average salary of approximately $100,000.  Management, as you might expect, makes the highest salary, with the average being $126,052.88.

OUTSOURCING:

 

As mentioned earlier, outsourcing is a huge concern to the engineering community. The chart below indicates where the jobs go.

JOB SATISFACTION:

 

Most engineers will tell you they stay in the profession because they love the work. The euphoria created by a “really neat” design stays with an engineer much longer than an elevated paycheck.  Engineers love solving problems.  Only two percent (2%) told MD they are not satisfied at all with their profession or current employer.  This is significant.

Any reason or reasons for leaving the engineering profession are shown by the following graphic.

ENGINEERING AND SOCIETY: 

As mentioned earlier, engineers are very worried about the H1-B visa program and trade policies issued by President Trump and the Legislative Branch of our country.  The Trans-Pacific Partnership has been “nixed” by President Trump, but trade policies such as NAFTA and trade with the EU are still of great concern to engineers.  Trade with China, patent infringement, and cybersecurity remain big issues for the STEM professions and certainly for engineers.

 

CONCLUSIONS:

I think it’s very safe to say that, for the most part, engineers are very satisfied with the profession and the salary levels it offers.  Job satisfaction is high, making the dawn of a new workday something NOT to be dreaded.

AUGMENTED REALITY (AR)

October 13, 2017


Depending on the location, you can ask just about anybody to give a definition of Virtual Reality (VR) and they will take a stab at it. This is because gaming and the entertainment segments of our population have used VR as a new tool to promote games such as SuperHot VR, Rock Band VR, House of the Dying Sun, Minecraft VR, Robo Recall, and others.  If you ask them about Augmented Reality or AR they probably will give you the definition of VR or nothing at all.

Augmented reality, sometimes called Mixed Reality, is a technology that merges real-world objects or the environment with virtual elements generated by sensory input devices for sound, video, graphics, or GPS data.  Unlike VR, which completely replaces the real world with a virtual world, AR operates in real time and is interactive with objects found in the environment, providing an overlaid virtual display over the real one.
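
For readers who want a feel for the mechanics, here is a minimal sketch of that core idea: drawing virtual graphics over a live camera view in real time.  This is my own toy illustration, not code from any product mentioned below; it assumes a webcam and the opencv-python package, and the “sensor reading” in the overlay is simulated.  Real AR systems add tracking, depth sensing, and object recognition on top of this basic overlay step.

```python
# Toy AR-style overlay (illustrative sketch only): draw virtual elements on top of
# a live, real-world camera feed. Assumes a webcam and opencv-python installed.
import cv2

cap = cv2.VideoCapture(0)                       # open the default camera
while True:
    ok, frame = cap.read()                      # grab the current real-world view
    if not ok:
        break
    h, w = frame.shape[:2]
    # "Virtual elements": a target ring and a simulated data label over the scene
    cv2.circle(frame, (w // 2, h // 2), 60, (0, 255, 0), 2)
    cv2.putText(frame, "Machine temp: 72 F (simulated)", (20, 40),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("AR overlay sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):       # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()
```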

While popularized by gaming, AR technology has shown a prowess for bringing an interactive digital world into a person’s perceived real world, where the digital aspect can reveal more information about a real-world object that is seen in reality.  This is basically what AR strives to do.  We are going to take a look at several very real applications of AR to indicate the possibilities of this technology.

  • Augmented Reality has found a home in healthcare, where it aids preventative measures and helps professionals receive information on the status of patients. Healthcare giant Cigna recently launched a program called BioBall that uses Microsoft HoloLens technology in an interactive game to test for blood pressure and body mass index, or BMI. Patients hold a light, medium-sized ball in their hands in a one-minute race to capture all the images that flash on the screen in front of them. The BioBall senses a player’s heartbeat. At the University of Maryland’s Augmentarium virtual and augmented reality laboratory, the school is using AR in healthcare to improve how ultrasound is administered to a patient.  Physicians wearing an AR device can look at both a patient and the ultrasound device while images flash on the “hood” of the AR device itself.
  • AR is opening up new methods to teach young children a variety of subjects they might not be interested in learning or, in some cases, help those who have trouble in class catching up with their peers. The University of Helsinki’s AR program helps struggling kids learn science by enabling them to virtually interact with the molecule movement in gases, gravity, sound waves, and airplane wind physics.   AR creates new types of learning possibilities by transporting “old knowledge” into a new format.
  • Projection-based AR is emerging as a new way to cast virtual elements into the real world without the use of bulky headgear or glasses. That is why AR is becoming a very popular alternative for use in the office or during meetings. Startups such as Lampix and Lightform are working on projection-based augmented reality for use in the boardroom, retail displays, hospitality rooms, digital signage, and other applications.
  • In Germany, a company called FleetBoard is in the development phase for application software that tracks logistics for truck drivers to help with the long series of pre-departure checks before setting off cross-country or for local deliveries. The FleetBoard Vehicle Lens app uses a smartphone and software to provide live image recognition to identify the truck’s number plate.  The relevant information is superimposed in AR, thus speeding up the pre-departure process.
  • Last winter, Delft University of Technology in the Netherlands started working with first responders in using AR as a tool in crime scene investigation. The handheld AR system allows on-scene investigators and remote forensic teams to minimize the potential for site contamination.  This could be extremely helpful in finding traces of DNA, preserving evidence, and getting medical help from an outside source.
  • Sandia National Laboratories is working with AR as a tool to improve security training for users who are protecting vulnerable areas such as nuclear weapons or nuclear materials. The physical security training helps guide users through real-world examples such as theft or sabotage in order to be better prepared when an event takes place.  The training can be accomplished remotely and cheaply using standalone AR headsets.
  • In Finland, the VTT Technical Research Center recently developed an AR tool for the European Space Agency (ESA) for astronauts to perform real-time equipment monitoring in space. AR prepares astronauts with in-depth practice by coordinating the activities with experts in a mixed-reality situation.
  • U.S.-based Daqri International uses computer vision for industrial AR to enable data visualization while working on machinery or in a warehouse. These glasses and headsets from Daqri display project data, tasks that need to be completed, and potential problems with machinery, or even where an object needs to be placed or repaired.

CONCLUSIONS:

Augmented Reality merges real-world objects with virtual elements generated by sensory input devices to provide great advantages to the user.  No longer are gaming and entertainment the sole objectives of its use.  This brings to life a “new normal” for professionals seeking more and better technology to provide solutions to real-world problems.

DEGREE OR NO DEGREE

October 7, 2017


The availability of information in books (as always), on the Internet, through seminars and professional shows, scientific publications, podcasts, webinars, etc. is amazing in today’s “digital age”.  That raises the question—is a college degree really necessary?   Can you rise to a level of competence and succeed by being self-taught?  For most, a college degree is the way to open doors. For a precious few, however, no help is needed.

Let’s look at twelve (12) individuals who did just that.

The co-founder of Apple and the force behind the iPod, iPhone, and iPad, Steve Jobs attended Reed College, an academically-rigorous liberal arts college with a heavy emphasis on social sciences and literature. Shortly after enrolling in 1972, however, he dropped out and took a job as a technician at Atari.

Legendary industrialist Howard Hughes is often said to have graduated from Caltech, but the truth is that the California school has no record of his having attended classes there. He did enroll at Rice University in Texas in 1924, but dropped out prematurely due to the death of his father.

Arguably Harvard’s most famous dropout, Bill Gates was already an accomplished software programmer when he started as a freshman at the Massachusetts campus in 1973. His passion for software actually began before high school, at the Lakeside School in Seattle, Washington, where he was programming in BASIC by age 13.

Just like his fellow Microsoft co-founder Bill Gates, Paul Allen was a college dropout.

Like Gates, he was also a star student (a perfect score on the SAT) who honed his programming skills at the Lakeside School in Seattle. Unlike Gates, however, he went on to study at Washington State University before leaving in his second year to work as a programmer at Honeywell in Boston.

Even for his time, Thomas Edison had little formal education. His schooling didn’t start until age eight, and then only lasted a few months.

Edison said that he learned most of his reading, writing, and math at home from his mother. Still, he became known as one of America’s most prolific inventors, amassing 1,093 U.S. patents and changing the world with such devices as the phonograph, fluoroscope, stock ticker, motion picture camera, mechanical vote recorder, and long-lasting incandescent electric light bulb. He is also credited with patenting a system of electrical power distribution for homes, businesses, and factories.

Michael Dell, founder of Dell Computer Corp., seemed destined for a career in the computer industry long before he dropped out of the University of Texas. He purchased his first calculator at age seven, applied to take a high school equivalency exam at age eight, and performed his first computer teardown at age 15.

A pioneer of early television technology, Philo T. Farnsworth was a brilliant student who dropped out of Brigham Young University after the death of his father, according to Biography.com.

Although born in a log cabin, Farnsworth quickly grasped technical concepts, sketching out his revolutionary idea for a television vacuum tube while still in high school, much to the confusion of teachers and fellow students.

Credited with inventing the controls that made fixed-wing powered flight possible, the Wright Brothers had little formal education.

Neither attended college, but they gained technical knowledge from their experiences working with printing presses, bicycles, and motors. By doing so, they were able to develop a three-axis controller, which served as the means to steer and maintain the equilibrium of an aircraft.

Stanford Ovshinsky managed to amass 400 patents covering subjects ranging from nickel-metal hydride batteries to amorphous silicon semiconductors to hydrogen fuel cells, all without the benefit of a college education. He is best known for his formation of Energy Conversion Devices and his pioneering work in nickel-metal hydride batteries, which have been widely used in hybrid and electric cars, as well as laptop computers, digital cameras, and cell phones.

Preston Tucker, designer of the infamous 1948 Tucker sedan, worked as a machinist, police officer and car salesman, but was not known to have attended college. Still, he managed to become founder of the Tucker Aviation Corp. and the Tucker Corp.

Larry Ellison dropped out of his pre-med studies at the University of Illinois in his second year and left the University of Chicago after only one term, but his brief academic experiences eventually led him to the top of the computer industry.

A Harvard dropout, Mark Zuckerberg was considered a prodigy before he even set foot on campus.

He began doing BASIC programming in middle school, created an instant messaging system while in high school, and learned to read and write French, Hebrew, Latin, and ancient Greek prior to enrolling in college.

CONCLUSIONS:

In conclusion, I want to leave you with a quote from President Calvin Coolidge:

Nothing in this world can take the place of persistence. Talent will not; nothing is more common than unsuccessful men with talent. Genius will not; unrewarded genius is almost a proverb. Education will not; the world is full of educated derelicts. Persistence and determination alone are omnipotent.

AMAZING GRACE

October 3, 2017


There are many people responsible for the revolutionary development and commercialization of the modern-day computer.  Just a few of those names are given below, many of whom you have probably never heard of.  Let’s take a look.

COMPUTER REVOLUTIONARIES:

  • Howard Aiken–Aiken was the original conceptual designer behind the Harvard Mark I computer in 1944.
  • Grace Murray Hopper–Hopper popularized the term “debugging” after an actual moth was removed from a computer in 1947. Her ideas about machine-independent programming led to the development of COBOL, one of the first modern programming languages. On top of it all, the Navy destroyer USS Hopper is named after her.
  • Ken Thompson and Dennis Ritchie–These guys invented Unix in 1969, the importance of which CANNOT be overstated. Consider this: your fancy Apple computer relies almost entirely on their work.
  • Doug and Gary Carlson–This team of brothers co-founded Brøderbund Software, a successful gaming company that operated from 1980-1999. In that time, they were responsible for churning out or marketing revolutionary computer games like Myst and Prince of Persia, helping bring computing into the mainstream.
  • Ken and Roberta Williams–This husband and wife team founded On-Line Systems in 1979, which later became Sierra Online. The company was a leader in producing graphical adventure games throughout the advent of personal computing.
  • Seymour Cray–Cray was a supercomputer architect whose computers were the fastest in the world for many decades. He set the standard for modern supercomputing.
  • Marvin Minsky–Minsky was a professor at MIT and oversaw the AI Lab, a hotspot of hacker activity, where he let prominent programmers like Richard Stallman run free. Were it not for his open-mindedness, programming skill, and ability to recognize that important things were taking place, the AI Lab wouldn’t be remembered as the talent incubator that it is.
  • Bob Albrecht–He founded the People’s Computer Company and developed a sincere passion for encouraging children to get involved with computing. He’s responsible for ushering in innumerable new young programmers and is one of the first modern technology evangelists.
  • Steve Dompier–At a time when computer speech was just barely being realized, Dompier made his computer sing. It was a trick he unveiled at the first meeting of the Homebrew Computer Club in 1975.
  • John McCarthy–McCarthy invented Lisp, the second-oldest high-level programming language that’s still in use to this day. He’s also responsible for bringing mathematical logic into the world of artificial intelligence — letting computers “think” by way of math.
  • Doug Engelbart–Engelbart is most noted for inventing the computer mouse in the mid-1960s, but he’s made numerous other contributions to the computing world. He created early GUIs and was even a member of the team that developed the now-ubiquitous hypertext.
  • Ivan Sutherland–Sutherland received the prestigious Turing Award in 1988 for inventing Sketchpad, the predecessor to the type of graphical user interfaces we use every day on our own computers.
  • Tim Paterson–He wrote QDOS, an operating system that he sold to Bill Gates in 1980. Gates rebranded it as MS-DOS, selling it to the point that it became the most widely used operating system of the day. (How ‘bout them apples?)
  • Dan Bricklin–He’s “The Father of the Spreadsheet.” Working in 1979 with Bob Frankston, he created VisiCalc, a predecessor to Microsoft Excel. It was the killer app of the time — people were buying computers just to run VisiCalc.
  • Bob Kahn and Vint Cerf–Prolific internet pioneers, these two teamed up to build the Transmission Control Protocol and the Internet Protocol, better known as TCP/IP. These are the fundamental communication technologies at the heart of the Internet.
  • Niklaus Wirth–Wirth designed several programming languages, but is best known for creating Pascal. He won a Turing Award in 1984 for “developing a sequence of innovative computer languages.”

ADMIRAL GRACE MURRAY HOPPER:

At this point, I want to highlight Admiral Grace Murray Hopper, or “Amazing Grace” as she is called in the computer world and the United States Navy.  Admiral Hopper’s picture is shown below.

Born in New York City in 1906, Grace Hopper joined the U.S. Navy during World War II and was assigned to program the Mark I computer. She continued to work in computing after the war, leading the team that created the first computer language compiler, which led to the popular COBOL language. She resumed active naval service at the age of 60, becoming a rear admiral before retiring in 1986. Hopper died in Virginia in 1992.

Born Grace Brewster Murray in New York City on December 9, 1906, Grace Hopper studied math and physics at Vassar College. After graduating from Vassar in 1928, she proceeded to Yale University, where, in 1930, she received a master’s degree in mathematics. That same year, she married Vincent Foster Hopper, becoming Grace Hopper (a name that she kept even after the couple’s 1945 divorce). Starting in 1931, Hopper began teaching at Vassar while also continuing to study at Yale, where she earned a Ph.D. in mathematics in 1934—becoming one of the first few women to earn such a degree.

After the war, Hopper remained with the Navy as a reserve officer. As a research fellow at Harvard, she worked with the Mark II and Mark III computers. She was at Harvard when a moth was found to have shorted out the Mark II, and is sometimes given credit for the invention of the term “computer bug”—though she didn’t actually author the term, she did help popularize it.

Hopper retired from the Naval Reserve in 1966, but her pioneering computer work meant that she was recalled to active duty—at the age of 60—to tackle standardizing communication between different computer languages. She would remain with the Navy for 19 years. When she retired in 1986, at age 79, she was a rear admiral as well as the oldest serving officer in the service.

Saying that she would be “bored stiff” if she stopped working entirely, Hopper took another job post-retirement and stayed in the computer industry for several more years. She was awarded the National Medal of Technology in 1991—becoming the first female individual recipient of the honor. At the age of 85, she died in Arlington, Virginia, on January 1, 1992. She was laid to rest in the Arlington National Cemetery.

CONCLUSIONS:

In 1997, the guided missile destroyer USS Hopper was commissioned by the Navy in San Francisco. In 2004, the University of Missouri honored Hopper with a computer museum on its campus, dubbed “Grace’s Place.” On display are early computers and computer components that educate visitors on the evolution of the technology. In addition to her programming accomplishments, Hopper’s legacy includes encouraging young people to learn how to program. The Grace Hopper Celebration of Women in Computing Conference is a technical conference that encourages women to become part of the world of computing, while the Association for Computing Machinery offers a Grace Murray Hopper Award. Additionally, on her birthday in 2013, Hopper was remembered with a “Google Doodle.”

In 2016, Hopper was posthumously honored with the Presidential Medal of Freedom by Barack Obama.

Who said women could not “do” STEM (Science, Technology, Engineering and Mathematics)?
