OUR SHRINKING WORLD

March 16, 2019


We sometimes do not realize how miniaturization has affected our everyday lives.  Electromechanical products have become smaller and smaller; one great example is the cell phone we carry and use every day.  Before we look at several examples, let’s start with a definition of miniaturization.

Miniaturization is the trend toward manufacturing ever-smaller mechanical, optical, and electronic products and devices. Examples include the miniaturization of mobile phones and computers and the downsizing of vehicle engines. In electronics, Moore’s Law predicted that the number of transistors on an integrated circuit for minimum component cost would double every eighteen (18) months. This enables processors to be built in smaller sizes. Miniaturization thus refers primarily to the evolution of electronic devices as they become smaller, faster, and more efficient. It also includes mechanical components, although it is sometimes very difficult to reduce the size of a functioning part.
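The doubling rule just stated is easy to see in numbers. Below is a minimal Python sketch; the 1971 starting point of roughly 2,300 transistors (the Intel 4004) is a commonly cited figure used here purely for illustration:

```python
# Project transistor counts under Moore's Law as stated above:
# a doubling every eighteen (18) months.
def transistors(start_count, start_year, year, doubling_months=18):
    """Projected transistor count, assuming a fixed doubling period."""
    doublings = (year - start_year) * 12 / doubling_months
    return start_count * 2 ** doublings

# Starting from ~2,300 transistors (Intel 4004, 1971):
for year in (1971, 1980, 1990, 2000):
    print(year, round(transistors(2300, 1971, year)))
```

At this rate the count grows roughly a hundredfold per decade, which is exactly why each generation of processors can be built so much smaller.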

The revolution of electronic miniaturization began during World War II and continues to change the world today. Miniaturization of computer technology has fueled a seemingly endless battle among technology giants around the world. The market has become so competitive that the companies developing microprocessors are constantly working to produce a smaller microchip than their competitors’, and as a result, computers become obsolete almost as soon as they are commercialized.  The concept underlying technological miniaturization is “the smaller the better”: smaller is faster, smaller is cheaper, smaller is more profitable. It is not just companies that profit from miniaturization advances; entire nations reap rewards through the capitalization of new developments. Devices such as personal computers, cellular telephones, portable radios, and camcorders have created massive markets through miniaturization and brought billions of dollars to the countries where they were designed and built. In the 21st century, almost every electronic device has a computer chip inside. The goal of miniaturization is to make these devices smaller and more powerful, and thus available everywhere. It has been said, however, that the time for continued miniaturization is limited: the smaller the computer chip gets, the more difficult it becomes to shrink the components that fit on it.  I personally do not think this is the case, but I am a mechanical engineer, not an electronic or electrical engineer.  I use the products; I do not develop them.

The world of miniaturization would not be possible at all were it not for semiconductor technology.  Devices made of semiconductors, notably silicon, are essential components of most electronic circuits.  A process called lithography is used to create circuitry layered over a silicon substrate. A transistor is a semiconductor device with three connections, capable of amplification in addition to rectification. Miniaturization entails increasing the number of transistors that can fit on a single chip while shrinking the size of the chip. As the surface area of a chip decreases, designing newer and faster circuits becomes more difficult, since there is less room left for the components that make the computer run faster and store more data.

There is no better example of miniaturization than cell phone development.  The digital picture below gives some indication of how the physical size of the cell phone has decreased over the years.  The cell phone at the far left is where it all started; at the right is where we are today.  If you look at the modern-day cell phone, you see a remarkable difference in size AND ability to communicate.  This is all possible due to shrinking computer chips.

One of the most striking changes due to miniaturization is the introduction of digital equipment into the modern-day aircraft cockpit.  The JPEG below is a mockup of an actual Convair 880.  With analog gauges, an engineering panel, and an exterior shell, this cockpit reads as 1960s/1970s-style design and fabrication.  In fact, this is the actual cockpit mockup that was used in the classic comedy film “Airplane!”.

Now, let us take a look at a digital cockpit.  Notice any differences?  Cleaner, with far fewer instruments.  The GUI, or graphical user interface, can take the place of the numerous dials and gauges that clutter and can confuse a pilot’s vision.

I think you get the picture, so I challenge you this upcoming week to take a look at those electromechanical items we take for granted and discover how they have been reduced in size.  You just may be surprised.

BENDABLE BATTERIES

February 1, 2019


I always marvel at the pace of technology and how it fills definite needs for products only dreamt of previously.   We have all heard that “necessity is the mother of invention,” and I believe that to a T.  We need it, we can’t find it, no one makes it, so let’s invent it.  This is the way adults solve problems.  Every week technology improves our lives, giving us labor-saving devices that “tomorrow” will become commonplace.  All electromechanical devices run on current driven by an impressed voltage.   Many of these devices use battery power for portability.   Lithium-ion batteries seem to be the batteries of choice right now due to their ability to hold a charge and to fast-charge.

Pioneering work with the lithium battery began in 1912 under G.N. Lewis, but it was not until the early 1970s that the first non-rechargeable lithium batteries became commercially available. Lithium is the lightest of all metals, has the greatest electrochemical potential, and provides the largest energy density per unit weight.

The energy density of lithium-ion is typically twice that of the standard nickel-cadmium. This is a huge advantage recognized by engineers and scientists the world over.  There is potential for higher energy densities. The load characteristics are reasonably good and behave similarly to nickel-cadmium in terms of discharge. The high cell voltage of 3.6 volts allows battery pack designs with only one cell. Most of today’s mobile phones run on a single cell. A nickel-based pack would require three 1.2-volt cells connected in series.
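The series-cell arithmetic in the paragraph above can be checked with a minimal Python sketch, assuming ideal nominal cell voltages:

```python
import math

# Number of series-connected cells needed to reach a target pack
# voltage, given the nominal voltage of a single cell.
def cells_needed(target_voltage, cell_voltage):
    return math.ceil(target_voltage / cell_voltage)

print(cells_needed(3.6, 3.6))  # lithium-ion: a single 3.6 V cell
print(cells_needed(3.6, 1.2))  # nickel-based: three 1.2 V cells in series
```

This is why a modern phone can run on one lithium-ion cell where a nickel-based pack would need three cells wired in series.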

Lithium-ion is a low-maintenance battery, an advantage that most other chemistries cannot claim. There is no memory effect, and no scheduled cycling is required to prolong the battery’s life. In addition, the self-discharge is less than half that of nickel-cadmium, making lithium-ion well suited for modern fuel-gauge applications. Lithium-ion cells also cause little harm when disposed of.

If we look at advantages and disadvantages, we see the following:

Advantages

  • High energy density – potential for yet higher capacities.
  • Does not need prolonged priming when new. One regular charge is all that’s needed.
  • Relatively low self-discharge – self-discharge is less than half that of nickel-based batteries.
  • Low Maintenance – no periodic discharge is needed; there is no memory.
  • Specialty cells can provide very high current to applications such as power tools.

Limitations

  • Requires protection circuit to maintain voltage and current within safe limits.
  • Subject to aging, even if not in use – storage in a cool place at 40% charge reduces the aging effect.
  • Transportation restrictions – shipment of larger quantities may be subject to regulatory control. This restriction does not apply to personal carry-on batteries.
  • Expensive to manufacture – about 40 percent higher in cost than nickel-cadmium.
  • Not fully mature – metals and chemicals are changing on a continuing basis.

One amazing property of Li-ion batteries is their ability to be formed into flexible shapes.  Let’s take a look.

Researchers have just published work on a new technology that will definitely fill a need.

ULSAN NATIONAL INSTITUTE OF SCIENCE AND TECHNOLOGY:

Researchers at the Ulsan National Institute of Science and Technology in Korea have developed an imprintable and bendable lithium-ion battery they claim is the world’s first, one that could hasten the introduction of flexible smartphones that leverage flexible display technology, such as Samsung’s Youm flexible OLED.

Samsung first demonstrated this display technology at CES 2013 as the next step in the evolution of mobile-device displays. The battery could also potentially be used in other flexible devices that debuted at the show, such as a wristwatch and a tablet.

Ulsan researchers had help on the technology from Professor John A. Rogers of the University of Illinois, researchers Young-Gi Lee and Gwangman Kim of Korea’s Electronics and Telecommunications Research Institute, and researcher Eunhae Gil of Kangwon National University. Rogers was also part of the team that developed a breakthrough in transient electronics, or electronics that dissolve inside the body.

The Korea JoongAng Daily newspaper first reported the story, citing the South Korea Ministry of Education, Science and Technology, which co-funded the research with the National Research Foundation of Korea.

The key to the flexible battery technology lies in nanomaterials that can be applied to any surface to create fluid-like polymer electrolytes that are solid, not liquid, according to Ulsan researchers. This is in contrast to typical device lithium-ion batteries, which use liquefied electrolytes that are put in square-shaped cases. Researchers say this also makes the flexible battery more stable and less prone to overheating.

“Conventional lithium-ion batteries that use liquefied electrolytes had problems with safety as the film that separates the electrolytes may melt under heat, in which case the positive and negative may come in contact, causing an explosion,” Lee told the Korean newspaper. “Because the new battery uses flexible but solid materials, and not liquids, it can be expected to show a much higher level of stability than conventional rechargeable batteries.”

This potential explosiveness of the materials in lithium-ion batteries, which in the past received attention because of exploding mobile devices, has been in the news again recently in the case of the Boeing 787 Dreamliner, which has had several instances of liquid-leaking lithium-ion batteries. The problems have grounded Boeing’s next-generation jet until they are investigated and resolved.

This is a very short posting but one I felt would be of great interest to my readers.  New technology, i.e., cutting-edge stuff, is fun to write about and possibly useful to learn about.  Hope you enjoy this one.

Please send me your comments:  bobjengr@comcast.net.


Space Exploration Technologies Corp., doing business as SpaceX, is a private American aerospace manufacturer and space transportation services company headquartered in Hawthorne, California. SpaceX has flown twenty-five (25) resupply missions to the International Space Station (ISS) under a partnership with NASA. As you all know, NASA no longer undertakes missions of this sort but relies upon private companies such as SpaceX to deliver supplies and equipment to the ISS as well as to launch communications satellites.

BACKGROUND: 

Entrepreneur Elon Musk, who co-founded PayPal and Tesla Motors, is the visionary who started Space Exploration Technologies.   In early 2002, Musk was seeking staff for the new company and approached rocket engineer Tom Mueller, now SpaceX’s CTO of Propulsion.  SpaceX was first headquartered in a seventy-five thousand (75,000) square foot warehouse in El Segundo, California. Musk decided SpaceX’s first rocket would be named Falcon 1, a nod to Star Wars’ Millennium Falcon. Musk planned Falcon 1’s first launch to occur in November 2003, fifteen (15) months after the company started. When you think about the timing, you must admit this is phenomenal and extraordinary.   Granted, the fact that it was an unmanned mission certainly cut the time, since there was no need for safety measures to protect a crew: no redundant systems were needed beyond protecting the launch vehicle and cargo.

In January 2005 SpaceX bought a ten percent (10%) stake in Surrey Satellite Technology and by March 2006, Musk had invested US $100 million in the company.

On August 4, 2008, SpaceX accepted a further twenty million dollar ($20 million) investment from Founders Fund.   In early 2012, approximately two-thirds of the company was owned by its founder, Musk, with his seventy (70) million shares of stock estimated to be worth $875 million on private markets.  The value of SpaceX was estimated at $1.3 billion as of February 2012.   After the COTS 2+ flight in May 2012, the company’s private equity valuation nearly doubled to $2.4 billion.

SATELLITE LAUNCH:

The latest version of SpaceX’s workhorse Falcon 9 rocket lifted off for the second time on July 22, lighting up the skies over Florida’s Space Coast in a dazzling predawn launch.  The “Block 5” variant of the two-stage Falcon 9 blasted off from Cape Canaveral Air Force Station at 1:50 a.m. EDT (0550 GMT), successfully delivering to orbit a satellite for the Canadian communications company Telesat.     Less than nine (9) minutes after launch, the rocket’s first stage came back down to Earth, with a successful landing aboard the SpaceX drone ship “Of Course I Still Love You” a few hundred miles off the Florida coast.  The Falcon 9 may be seen in the JPEG below.

The Block 5 is the newest, most powerful and most reusable version of the Falcon 9.  Musk said the Block 5 first stages are designed to fly at least ten (10) times with just inspections between landing and liftoff, and one hundred (100) times or more with some refurbishment involved.

Such extensive reuse is key to Musk’s quest to slash the cost of spaceflight, making Mars colonization and other bold exploration efforts economically feasible. To date, SpaceX has successfully landed more than two dozen Falcon 9 first stages and re-flown landed boosters on more than a dozen occasions.
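A toy amortization model illustrates why reuse slashes costs; the dollar figures below are hypothetical placeholders, not actual SpaceX numbers:

```python
# Average cost per flight when a booster's build cost is amortized
# over several flights, plus any per-flight refurbishment cost.
# All dollar amounts here are hypothetical, for illustration only.
def cost_per_flight(build_cost, flights, refurb_per_flight=0.0):
    return build_cost / flights + refurb_per_flight

expendable = cost_per_flight(30e6, 1)                      # flown once, discarded
reused = cost_per_flight(30e6, 10, refurb_per_flight=1e6)  # flown ten times
print(f"Expendable booster: ${expendable / 1e6:.0f}M per flight")
print(f"Reused 10 times:    ${reused / 1e6:.0f}M per flight")
```

Whatever the true numbers turn out to be, spreading the build cost over ten or one hundred flights changes the economics of launch dramatically.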

The only previous Block 5 flight occurred in May 2018 and also involved a new rocket configuration.  The satellite lofted, called Telstar 19V, is headed for geostationary orbit, about 22,250 miles (35,800 kilometers) above Earth. Telstar 19V, which was built by the California-based company SSL, will provide broadband service to customers throughout the Americas and the Atlantic Ocean region, according to a Telesat fact sheet.

The booster’s first stage, sporting redesigned landing legs, improved heat shield insulation, upgraded avionics and more powerful engines with crack-resistant turbine hardware, flipped around moments after falling away from the Falcon 9’s second stage and flew itself back to an on-target landing on an offshore drone-ship.

It was the 25th successful booster recovery overall for SpaceX and the fifth so far this year, the latest demonstration of SpaceX’s maturing ability to bring orbit-class rockets back to Earth to fly again in the company’s drive to dramatically lower launch costs.

CONCLUSION:

I think the fact that Musk has taken on this project is quite extraordinary.  Rocket launches, in times past, have represented an amazing expenditure of capital, with the first and second stages being lost forever.  The payload, generally carried by the third stage, goes on to accomplish the ultimate mission.  Stages one and two are discarded, falling back to Earth or lingering as debris that poses a menace to other launches.  Being able to reuse any portion of stages one and two is a great cost-saving measure, and quite frankly, no one really thought it could be accomplished.


I feel that most individuals, certainly most adults, wonder if anyone is out there.  Are there other planets with intelligent life, and is that life humanoid or at least somewhat intelligent?  The first task would be to define “intelligent.”  Don’t laugh: this question has real merit and has been considered by behavioral scientists for a significant length of time.  On Earth, human intelligence took nearly four (4) billion years to develop. If living beings develop advanced technology, they can make their existence known to the Universe. A working definition of “intelligent” includes self-awareness, use of tools, and use of language. There are other defining traits, as follows:

  • Crude perceptive abilities: e.g., the concept of a handshake (sending a message and acknowledging receipt of one sent to you)
  • Crude communication abilities: some primitive language and vocabulary
  • Sentience: should be capable of original thought and motivation, with some form of self-awareness
  • Retention: the ability to remember and recall information at will
  • Some form of mathematical ability, such as counting

Please feel free to apply your own definition to intelligence. You will probably come as close as anyone to a workable one.

TESS:

NASA is looking, and one way the search is being conducted is with the new satellite TESS.

The Transiting Exoplanet Survey Satellite (TESS) is an Explorer-class planet finder.   TESS will pick up the search for exoplanets as the Kepler Space Telescope runs out of fuel.

Kepler, which has discovered more than 4,500 potential and confirmed exoplanets, launched in 2009. After a mechanical failure in 2013, it entered a new phase of campaigns, called the K2 mission, to survey other areas of the sky for exoplanets. This enabled researchers to discover even more exoplanets, understand the evolution of stars, and gain insight into supernovae and black holes.

Soon, Kepler’s mission will end, and it will be abandoned in space, orbiting the Sun and never getting closer to Earth than the Moon.

A spaceborne all-sky transit survey, TESS will identify planets ranging from Earth-sized to gas giants, orbiting a wide range of stellar types at a wide range of orbital distances. The principal goal of the TESS mission is to detect small planets with bright host stars in the solar neighborhood, so that detailed characterizations of the planets and their atmospheres can be performed. TESS is only one satellite used to determine whether there are any “Goldilocks” planets in our stellar neighborhood. TESS will survey an area four hundred (400) times larger than Kepler observed. This includes two hundred thousand (200,000) of the brightest nearby stars. Over the course of two years, the four wide-field cameras on board will stare at different sectors of the sky for days at a time.

TESS will begin by looking at the Southern Hemisphere sky for the first year and move to the Northern Hemisphere in the second year. It can accomplish this lofty goal by dividing the sky into thirteen (13) sections and looking at each one for twenty-seven (27) days before moving on to the next.
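The arithmetic behind that observing plan checks out neatly in a few lines of Python:

```python
# TESS observing plan described above: each hemisphere is divided
# into 13 sectors, observed for 27 days apiece.
SECTORS_PER_HEMISPHERE = 13
DAYS_PER_SECTOR = 27

days_per_hemisphere = SECTORS_PER_HEMISPHERE * DAYS_PER_SECTOR
print("Days per hemisphere:", days_per_hemisphere)            # 351, just under a year
print("Sectors over two years:", 2 * SECTORS_PER_HEMISPHERE)  # 26
```

Thirteen 27-day sectors fill 351 days, just under a year per hemisphere, and two years of observing yields 26 sectors in total.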

The various missions launched to discover exoplanets may be seen below.

As mentioned earlier, TESS will monitor the brightness of more than two hundred thousand (200,000) stars during a two-year mission, searching for temporary drops in brightness caused by planetary transits. Transits occur when a planet’s orbit carries it directly in front of its parent star as viewed from Earth. TESS is expected to catalog more than fifteen hundred (1,500) transiting exoplanet candidates, including a sample of approximately five hundred (500) Earth-sized and “Super Earth” planets with radii less than twice that of the Earth. TESS will detect small rock-and-ice planets orbiting a diverse range of stellar types and covering a wide span of orbital periods, including rocky worlds in the habitable zones of their host stars.  This is a major undertaking, and as you might suspect, joint ventures are an absolute must.  That being the case, the major partners in this endeavor may be seen as follows:
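The transit measurement itself reduces to a simple ratio: the fractional dip in starlight is approximately the square of the planet-to-star radius ratio. The Python sketch below uses standard solar-system radii purely for illustration:

```python
# Transit depth: fractional drop in a star's brightness when a planet
# crosses its disk, approximately (R_planet / R_star) ** 2.
R_SUN = 696_000      # km, solar radius
R_EARTH = 6_371      # km
R_JUPITER = 69_911   # km

def transit_depth(r_planet, r_star=R_SUN):
    return (r_planet / r_star) ** 2

print(f"Earth-sized planet:   {transit_depth(R_EARTH):.6f}")
print(f"Jupiter-sized planet: {transit_depth(R_JUPITER):.4f}")
```

A Jupiter-sized planet blocks about one percent of a Sun-like star’s light, while an Earth-sized planet blocks less than a hundredth of a percent, which is why TESS favors bright, nearby host stars.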

The project overview is given by the next pictorial.

In summary:

TESS will tile the sky with 26 observation sectors:

  • At least 27 days staring at each 24° × 96° sector
  • Brightest 200,000 stars at 1-minute cadence
  • Full frame images with 30-minute cadence
  • Map Southern hemisphere in first year
  • Map Northern hemisphere in second year
  • Sectors overlap at ecliptic poles for sensitivity to smaller and longer period planets in JWST Continuous Viewing Zone (CVZ)

TESS observes from unique High Earth Orbit (HEO):

  • Unobstructed view for continuous light curves
  • Two 13.7-day orbits per observation sector
  • Stable 2:1 resonance with Moon’s orbit
  • Thermally stable and low-radiation

The physical hardware looks as follows:

You can’t tell much about the individual components from the digital picture above, but suffice it to say that TESS is a significant improvement over Kepler in terms of technology.  The search continues, and I do not know what will happen if we ever discover ET.  Imagine the areas of life that would be affected.


One source for this post is the Forbes Magazine article “U.S. Dependence on Foreign Oil Hits 30-Year Low” by Mr. Mike Patton.  Other sources were also used.

The United States is at this point in time “energy independent,” for the most part.   Do you remember the 1970s and how, at times, it was extremely difficult to buy gasoline?  If you were driving then, you certainly must remember waiting in line for an hour or more just to put gas in the ol’ car. Thanks to the OPEC oil embargo, petroleum was in short supply. At that time, America’s need for crude oil was soaring while U.S. production was falling. As a result, the U.S. was becoming increasingly dependent on foreign suppliers. Things have changed a great deal since then. Beginning in the mid-2000s, America’s dependence on foreign oil began to decline.  One of the reasons for this decline is the abundance of natural gas, or methane, in the US.

“At the rate of U.S. dry natural gas consumption in 2015 of about 27.3 Tcf (trillion cubic feet) per year, the United States has enough natural gas to last about 86 years. The actual number of years will depend on the amount of natural gas consumed each year, natural gas imports and exports, and additions to natural gas reserves.” (July 25, 2017)

For most of the one hundred and fifty (150) years of U.S. oil and gas production, natural gas has played second fiddle to oil. That appeared to change in the mid-2000s, when natural gas became the star of the shale revolution and eight of every ten (10) rigs were chasing gas targets.

But natural gas turned out to be a shooting star. Thanks to the industry’s incredible success in leveraging game-changing technology to commercialize ultralow-permeability reservoirs, the market was looking at a supply glut by 2010, with prices below producer break-even values in many dry gas shale plays.

Everyone knows what happened next. The shale revolution quickly transitioned to crude oil production, and eight of every ten (10) rigs suddenly were drilling liquids. What many in the industry did not realize initially, however, is that tight oil and natural gas liquids plays would yield substantial associated gas volumes. With ongoing, dramatic per-well productivity increases in shale plays, and associated dry gas flowing from liquids resource plays, the beat just keeps going with respect to growth in oil, NGL and natural gas supplies in the United States.

Today’s market conditions certainly are not what had once been envisioned for clean, affordable and reliable natural gas. But producers can rest assured that vision of a vibrant, growing and stable market will become a reality; it just will take more time to materialize. There is no doubt that significant demand growth is coming, driven by increased consumption in industrial plants and natural gas-fired power generation, as well as exports, including growing pipeline exports to Mexico and overseas shipments of liquefied natural gas.

Just over the horizon, the natural gas star is poised to again shine brightly. But in the interim, what happens to the supply/demand equation? This is a critically important question for natural gas producers, midstream companies and end-users alike.

Natural gas production in the lower-48 states has increased from less than fifty (50) billion cubic feet a day (Bcf/d) in 2005 to about 70 Bcf/d today. This is an increase of forty (40%) percent over nine years, or a compound annual growth rate of about four (4%) percent. There is no indication that this rate of increase is slowing. In fact, with continuing improvements in drilling efficiency and effectiveness, natural gas production is forecast to reach almost ninety (90) Bcf/d by 2020, representing another twenty-nine (29%) percent increase over 2014 output.
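Those growth figures can be double-checked with a couple of lines of arithmetic:

```python
# Checking the production growth quoted above: ~50 Bcf/d in 2005
# rising to ~70 Bcf/d over nine years, and ~90 Bcf/d forecast by 2020.
def cagr(start, end, years):
    """Compound annual growth rate."""
    return (end / start) ** (1 / years) - 1

print(f"Total growth 2005 onward: {(70 - 50) / 50:.0%}")   # 40%
print(f"Compound annual rate:     {cagr(50, 70, 9):.1%}")  # ~3.8%, roughly 4%
print(f"Increase from 70 to 90:   {(90 - 70) / 70:.0%}")   # ~29%
```

The quoted 40 percent total growth, roughly 4 percent compound annual rate, and 29 percent further increase are all mutually consistent.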

Most of this production growth is concentrated in a few extremely prolific producing regions. Four of these are in a fairway that runs from the Texas Gulf Coast to North Dakota through the middle section of the country, and encompasses the Eagle Ford, the Permian Basin, the Granite Wash, the SouthCentral Oklahoma Oil Play and other basins in Oklahoma, and the Williston Basin. The other major producing region is the Marcellus and Utica shales in the Northeast. Almost all the natural gas supply growth is coming from these regions.

We are at the point where this abundance allows US companies to export LNG, or liquefied natural gas.   To move this cleaner-burning fuel across oceans, natural gas must be converted into liquefied natural gas (LNG), a process called liquefaction. LNG is natural gas that has been cooled to –260° F (–162° C), changing it from a gas into a liquid that occupies 1/600th of its original volume.  This would be the same requirement for Dayton: the methane gas captured would need to be liquefied and stored.  The LNG is then transported in a vessel similar to the one shown below:
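The two numbers in that description, the cryogenic temperature and the 600-to-1 volume reduction, can be related with the standard Fahrenheit-to-Celsius conversion:

```python
# LNG figures from the paragraph above: natural gas cooled to about
# -260 degrees F becomes a liquid at 1/600th of its gaseous volume.
def f_to_c(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32) * 5 / 9

VOLUME_RATIO = 600  # gas volume : liquid volume

print(f"-260 F = {f_to_c(-260):.0f} C")
print(f"600 m3 of gas liquefies to {600 / VOLUME_RATIO:.0f} m3 of LNG")
```

That 600-fold shrinkage is what makes ocean transport of natural gas practical at all.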

As you might expect, a vessel such as this requires very specific designs for the containment area.  A cutaway is given below to indicate just how exacting that design must be to accomplish, without mishap, the transportation of LNG to other areas of the world.

Loading LNG from storage onto the vessel is no easy matter either and requires another significant expenditure of capital.

For this reason, LNG facilities around the world are somewhat limited in number.  The map below indicates their locations.

A typical LNG station, showing both process and loading areas, may be seen below.  This one is in Darwin.

CONCLUSIONS:

With natural gas in great supply, there will be increasing demand around the world for this precious commodity.  We already see automobiles using LNG instead of gasoline as a primary fuel.  The cost of LNG is also significantly less than gasoline, even with average US gasoline prices around $2.00 per gallon.  According to AAA, the national average for regular, unleaded gasoline has fallen for thirty-five (35) of the last thirty-six (36) days to $2.21 per gallon and sits at the lowest mark for this time of year since 2004. Gas prices continue to drop in most parts of the country due to abundant fuel supplies and declining crude oil costs. Average prices are about fifty-five (55) cents less than a year ago, which is motivating millions of Americans to take advantage of cheap gas by taking long road trips this summer.

I think the bottom line is: natural gas is here to stay.

THEY GOT IT ALL WRONG

November 15, 2017


We all have heard that necessity is the mother of invention.  There have been wonderful advances in technology since the Industrial Revolution but some inventions haven’t really captured the imagination of many people, including several of the smartest people on the planet.

Consider, for example, this group: Thomas Edison, Lord Kelvin, Steve Ballmer, Robert Metcalfe, and Albert Augustus Pope. Despite backgrounds of amazing achievement and even brilliance, all share the dubious distinction of making some of the worst technological predictions in history and I mean the very worst.

Had they been right, history would be radically different and today, there would be no airplanes, moon landings, home computers, iPhones, or Internet. Fortunately, they were wrong.  And that should tell us something: Even those who shape the future can’t always get a handle on it.

Let’s take a look at several forecasts that were most publicly, painfully incorrect. From Edison to Kelvin to Ballmer, here are some of the worst technological predictions in history.

“Heavier-than-air flying machines are impossible.” William Thomson (often referred to as Lord Kelvin), mathematical physicist and engineer, President, Royal Society, in 1895.

A prolific scientific scholar whose name is commonly associated with the history of math and science, Lord Kelvin was nevertheless skeptical about flight. In retrospect, it is often said that Kelvin was quoted out of context, but his aversion to flying machines was well known. At one point, he is said to have publicly declared that he “had not the smallest molecule of faith in aerial navigation.” OK, go tell that to Wilbur and Orville.

“Fooling around with alternating current is just a waste of time. No one will use it, ever.” Thomas Edison, 1889.

Thomas Edison’s brilliance was unassailable. A prolific inventor, he earned 1,093 patents in areas ranging from electric power to sound recording to motion pictures and light bulbs. But he believed that alternating current (AC) was unworkable and that its high voltages were dangerous. As a result, he battled those who supported the technology. His so-called “war of currents” came to an end, however, when AC grabbed a larger market share, and he was forced out of the control of his own company.

“Computers in the future may weigh no more than 1.5 tons.” Popular Mechanics Magazine, 1949.

The oft-repeated quotation, which has virtually taken on a life of its own over the years, is actually condensed. The original quote was: “Where a calculator like the ENIAC today is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in the future may have only 1,000 vacuum tubes and perhaps weigh only 1.5 tons.” Stated either way, though, the quotation delivers a clear message: Computers are mammoth machines, and always will be. Prior to the emergence of the transistor as a computing tool, no one, including Popular Mechanics, foresaw the incredible miniaturization that was about to begin.

“Television won’t be able to hold on to any market it captures after the first six months. People will soon get tired of staring at a plywood box every night.” Darryl Zanuck, 20th Century Fox, 1946.

Hollywood film producer Darryl Zanuck earned three Academy Awards for Best Picture, but proved he had little understanding of the tastes of Americans when it came to technology. Television provided an alternative to the big screen and a superior means of influencing public opinion, despite Zanuck’s dire predictions. Moreover, the technology didn’t wither after six months; it blossomed. By the 1950s, many homes had TVs. In 2013, 79% of the world’s households had them.

“I predict the Internet will go spectacularly supernova and in 1996 catastrophically collapse.” Robert Metcalfe, founder of 3Com, in 1995.

An MIT-educated electrical engineer who co-invented Ethernet and founded 3Com, Robert Metcalfe is a holder of the National Medal of Technology, as well as an IEEE Medal of Honor. Still, he apparently was one of many who failed to foresee the unbelievable potential of the Internet. Today, 47% of the 7.3 billion people on the planet use the Internet. Metcalfe is currently a professor of innovation and Murchison Fellow of Free Enterprise at the University of Texas at Austin.

“There’s no chance that the iPhone is going to get any significant market share.” Steve Ballmer, former CEO, Microsoft Corp., in 2007.

Some magna cum laude Harvard math graduate with an estimated $33 billion in personal wealth, Steve Ballmer had an amazing tenure at Microsoft. Under his leadership, Microsoft’s annual revenue surged from $25 billion to $70 billion, and its net income jumped 215%. Still, his insights failed him when it came to the iPhone. Apple sold 6.7 million iPhones in its first five quarters, and by end of fiscal year 2010, its sales had grown to 73.5 million.

“After the rocket quits our air and starts on its longer journey, its flight would be neither accelerated nor maintained by the explosion of the charges it then might have left.” The New York Times, 1920.

The New York Times was sensationally wrong when it assessed the future of rocketry in 1920, but few people of the era were in a position to dispute its declaration. Forty-one years later, astronaut Alan Shepard became the first American to enter space, and forty-nine years later Neil Armstrong set foot on the moon, laying waste to the idea that rocketry wouldn’t work. When Apollo 11 was on its way to the moon in 1969, the Times finally acknowledged the famous quotation and amended its view on the subject.

“With over 15 types of foreign cars already on sale here, the Japanese auto industry isn’t likely to carve out a big share of the market for itself.” Business Week, August 2, 1968.

Business Week seemed to be on safe ground in 1968, when it predicted that Japanese market share in the auto industry would be minuscule. But the magazine’s editors underestimated the American consumer’s growing distaste for the domestic concept of planned obsolescence. By the 1970s, Americans were flocking to Japanese dealerships, in large part because Japanese manufacturers made inexpensive, reliable cars. That trend has continued over the past 40 years. In 2016, Japanese automakers built more cars in the US than Detroit did.

“You cannot get people to sit over an explosion.” Albert Augustus Pope, founder, Pope Manufacturing, in the early 1900s.

Albert Augustus Pope thought he saw the future when he launched production of electric cars in Hartford, CT, in 1897. Listening to the quiet performance of the electrics, he made his now-famous declaration about the future of the internal combustion engine. Despite his preference for electrics, however, Pope also built gasoline-burning cars, laying the groundwork for future generations of IC engines. In 2010, there were more than one billion vehicles in the world, the majority of which used internal combustion propulsion.

“I have traveled the length and breadth of this country and talked to the best people, and I can assure you that data processing is a fad that won’t last out the year.” Editor, Prentice Hall Books, 1957.

The concept of data processing was a head-scratcher in 1957, especially for the unnamed Prentice Hall editor who uttered the oft-quoted prediction of its demise. The prediction has since been used in countless technical presentations, usually as an example of our inability to see the future. Amazingly, the editor’s forecast has recently begun to look even worse, as Internet of Things users search for ways to process the mountains of data coming from a new breed of connected devices. By 2020, experts predict there will be 30 to 50 billion such connected devices sending their data to computers for processing.

CONCLUSIONS:

Last but not least, Charles Holland Duell was appointed United States Commissioner of Patents in 1898, and held that post until 1901.  In that role, he is famous for purportedly saying, “Everything that can be invented has been invented.”  Well, Charlie, maybe not.


Portions of the following post were taken from the September 2017 issue of Machine Design magazine.

We all like to keep up with salary levels within our chosen profession.  It’s a great indicator of where we stand relative to our peers and the industry we participate in.  The state of the engineering profession has always been relatively stable. Engineers are as essential to the job market as doctors are to medicine. Even in the face of automation and the fear many have of losing their jobs to robots, engineers are still in high demand.  I personally do not think most engineers will be displaced by robotic systems.  That fear belongs mainly to production-line manufacturing positions with duties that are repetitive in nature.  As long as engineers can think, they will have employment.

The Machine Design Annual Salary & Career Report collected information and opinions from more than two thousand (2,000) Machine Design readers. The employee outlook is very good, with thirty-three percent (33%) indicating they are staying with their current employer and thirty-six percent (36%) of employers focusing on job retention.  This is up fifteen percent (15%) from 2016.  Among those who responded to the survey, the average reported salary for engineers across the country was $99,922; nearly fifty-eight percent (57.9%) reported a salary increase while fewer than ten percent (9.7%) reported a salary decrease. The top three earning industries with the largest workforces were 1.) industrial control systems and equipment, 2.) research & development, and 3.) medical products. Among these industries, the average salary was $104,193. The West Coast looks like the best place for engineers to earn a living, with the average salary in California, Washington, and Oregon being $116,684. Of course, the cost of living in these three states is definitely higher than in other regions of the country.

PROFILE OF THE ENGINEER IN THE USA TODAY:

As is the ongoing trend in engineering, the profession is dominated by male engineers, with seventy-one percent (71%) being over fifty (50) years of age. However, the MD report shows an upswing of young engineers entering the profession.  One effort that has been underway for some years now is encouraging more women to enter the profession.  With seventy-one percent (71%) of the engineering workforce being over fifty, there is a definite need to attract new participants.  There was an increase in engineers between twenty-five (25) and thirty-five (35) years of age, up from 5.6% to 9.2%.  The percentage of individuals entering the profession increased as well, with engineers having less than fourteen (14) years of experience increasing five percent (5%) from last year.  Even with all the challenges of engineering, ninety-two percent (92%) would still recommend the engineering profession to their children, grandchildren, and others. One engineer responded, “In fact, wherever I’ll go, I always will have an engineer’s point of view. Trying to understand how things work, and how to improve them.”

When asked about foreign labor forces, fifty-four percent (54%) believe H-1B visas hurt engineering employment opportunities, and sixty-one percent (61%) support measures to reform the system. In terms of outsourcing, fifty-two percent (52%) reported their companies outsource work, the main reason being a lack of in-house talent. However, seventy-three percent (73%) of the outsourced work goes to other U.S. locations. When discussing the future of the job force, fifty-five percent (55%) of engineers believe there is a job shortage, specifically in the skilled labor area. An overwhelming eighty-seven percent (87%) believe that we lack a skilled labor force. According to the MD readers, the strongest place for job growth is in automation at forty-five percent (45%), and the strongest place to look for skilled laborers is in vocational schools at thirty-two percent (32%). The future of engineering depends not only on the new engineers in school today, but also on younger people just beginning to develop their science, technology, engineering, and mathematics (STEM) interests. With the average engineer being fifty (50) years old or older, the future of engineering will rely heavily on new engineers willing to carry the torch; eighty-seven percent (87%) of our engineers believe there needs to be more focus on STEM at an earlier age to make sure the future of engineering is secure.

With that being the case, let us now look at the numbers.

The engineering profession is a “graying” profession, as mentioned earlier.  The next digital picture will indicate that, for the most part, those in engineering have been in for the “long haul”.  They are “lifers”.  This fact speaks volumes when trying to influence young men and women to consider the field of engineering.  If you look at “years in the profession”, “work location”, and “years at present employer”, we see the following:

The slide below is a surprise to me, and I think this is the first time the question has been asked by Machine Design: how much of your engineering training is theory vs. practice? You can see the greatest response, at almost fourteen percent (13.6%), is a fifty/fifty balance between theory and practice.  In my opinion, this is as it should be.

“The theory can be learned in a school, but the practical applications need to be learned on the job. The academic world is out of touch with the current reality of practical applications since they do not work in that area.”

“My university required three internships prior to graduating. This allowed them to focus significantly on theoretical, fundamental knowledge and have the internships bolster the practical.”

ENGINEERING CERTIFICATIONS:

The demands made on engineers by their respective companies can sometimes be time-consuming.  The respondents indicated the following certifications that their companies felt were necessary.

SALARIES:

The lowest salary is found in contract design and manufacturing.  Even this salary would be much desired by just about any individual.

As we mentioned earlier, the West Coast provides the highest salaries, with several states in the New England area coming in a fairly close second.

SALARY LEVELS VS. EXPERIENCE:

This one should be no surprise: the greater the number of years in the profession, the greater the salary level.  Forty (40) plus years provides an average salary of approximately $100,000.  Management, as you might expect, earns the highest salary, with the average being $126,052.88.

OUTSOURCING:

As mentioned earlier, outsourcing is a huge concern to the engineering community. The chart below indicates where the jobs go.

JOB SATISFACTION:

Most engineers will tell you they stay in the profession because they love the work. The euphoria created by a “really neat” design stays with an engineer much longer than an elevated paycheck.  Engineers love solving problems.  Only two percent (2%) told MD they are not satisfied at all with their profession or current employer.  This is significant.

The reasons for leaving the engineering profession are shown in the following graphic.

ENGINEERING AND SOCIETY: 

As mentioned earlier, engineers are very worried about the H-1B visa program and the trade policies issued by President Trump and the Legislative Branch of our country.  The Trans-Pacific Partnership has been “nixed” by President Trump, but trade policies such as NAFTA and trade with the EU are still of great concern to engineers.  Trade with China, patent infringement, and cybersecurity remain big issues for the STEM professions and certainly for engineers.

 

CONCLUSIONS:

I think it’s very safe to say that, for the most part, engineers are very satisfied with the profession and the salary levels it offers.  Job satisfaction is high, making the dawn of a new day something NOT to be dreaded.
