HALF SMART

August 23, 2014


The other day I was visiting a client and discussing a project involving the application of a robotic system to an existing work cell. The process is somewhat complex, and we all questioned which employee would manage the operation of the cell, including the system. The system is a SCARA type. SCARA is an acronym for Selective Compliance Assembly Robot Arm or Selective Compliance Articulated Robot Arm.

In 1981, Sankyo Seiki, Pentel, and NEC presented a completely new concept for assembly robots. The robot was developed under the guidance of Hiroshi Makino, a professor at the University of Yamanashi, and was called the Selective Compliance Assembly Robot Arm, or SCARA.

SCARAs are generally faster and cleaner than comparable Cartesian (X, Y, Z) robotic systems. Their single pedestal mount requires a small footprint and provides an easy, unhindered form of mounting. On the other hand, SCARAs can be more expensive than comparable Cartesian systems, and the controlling software requires inverse kinematics for linear interpolated moves. This software typically comes with the SCARA, however, and is usually transparent to the end-user. The SCARA system used in this work cell could store one hundred programs with one hundred data points per program. It was programmed by means of a “teach pendant” and a “jog” switch controlling the placement of the robotic arm over the material.
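
Since the inverse-kinematics point may seem abstract, here is a minimal Python sketch of the closed-form inverse kinematics for a generic two-link SCARA arm. This is an illustration only; the link lengths and target coordinates are hypothetical values, not parameters of the actual system in this work cell.

    import math

    def scara_ik(x, y, l1, l2, elbow="right"):
        # Closed-form inverse kinematics for a two-link planar SCARA arm.
        # Returns the shoulder and elbow joint angles (radians) that place
        # the wrist at (x, y); raises ValueError if the point is unreachable.
        c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)  # law of cosines
        if not -1.0 <= c2 <= 1.0:
            raise ValueError("target is outside the reachable workspace")
        s2 = math.sqrt(1.0 - c2 * c2)
        if elbow == "left":
            s2 = -s2
        theta2 = math.atan2(s2, c2)
        # Shoulder angle: bearing to the target minus the offset from link 2.
        theta1 = math.atan2(y, x) - math.atan2(l2 * s2, l1 + l2 * c2)
        return theta1, theta2

    # Hypothetical example: 400 mm and 300 mm links, wrist target at (500, 200) mm.
    t1, t2 = scara_ik(500.0, 200.0, 400.0, 300.0)
    print(round(math.degrees(t1), 1), round(math.degrees(t2), 1))

A linear interpolated move is simply this calculation repeated for many closely spaced points along a straight-line path, which is why the controller, rather than the end-user, carries that burden.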

Several names were mentioned as to who might ultimately, after training, be capable of taking on this task. When one individual was named, the retort was, “Not James, he is only half smart.” That got me to thinking about “smarts.” How smart is smart? At what point do we say smart is smart enough?

IQ CHARTS—WHO’S SMART

The concept of IQ, or intelligence quotient, was developed either by the German psychologist and philosopher Wilhelm Stern in 1912 or by Lewis Terman in 1916, depending on which of several sources you consult. Intelligence testing on a large scale was accomplished before either of those dates, however. In 1904, psychologist Alfred Binet was commissioned by the French government to create a testing system to differentiate intellectually normal children from those who were inferior.

From Binet’s work came the IQ scale called the “Binet Scale” (and later the “Simon-Binet Scale”). Sometime later, “intelligence quotient,” or “IQ,” entered our vocabulary. Lewis M. Terman revised the Simon-Binet IQ Scale and in 1916 published the Stanford Revision of the Binet-Simon Scale of Intelligence (also known as the Stanford-Binet).

Intelligence tests are among the most popular types of psychological tests in use today. On the majority of modern IQ tests, the average (or mean) score is set at 100 with a standard deviation of 15, so that scores conform to a normal distribution curve. This means that 68 percent of scores fall within one standard deviation of the mean (that is, between 85 and 115) and 95 percent of scores fall within two standard deviations (between 70 and 130). This is illustrated by the following bell-shaped curve:

Bell-Shaped Curve Showing IQ

Why is the average score set to 100? Psychometricians, specialists in the theory and technique of psychological measurement, utilize a process known as standardization in order to make it possible to compare and interpret the meaning of IQ scores. This process is accomplished by administering the test to a representative sample and using those scores to establish standards, usually referred to as norms, against which all individual scores can be compared. Since the average score is 100, experts can quickly assess individual test scores against the average to determine where they fall on the normal distribution.
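
As a concrete illustration of the standardization step, the short Python sketch below converts a raw test score into a deviation IQ by rescaling its position within the norming sample to the IQ convention of mean 100 and standard deviation 15. The norming-sample numbers are made up for the example.

    def deviation_iq(raw_score, sample_mean, sample_sd):
        # Rescale a raw score to the IQ convention: mean 100, SD 15.
        z = (raw_score - sample_mean) / sample_sd  # standing within the norm sample
        return 100.0 + 15.0 * z

    # Hypothetical norms: mean raw score 42, standard deviation 8.
    print(deviation_iq(50, 42, 8))  # one SD above the mean -> 115.0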

The following scale resulted for classifying IQ scores:

Over 140 – Genius or almost genius
120 – 140 – Very superior intelligence
110 – 119 – Superior intelligence
90 – 109 – Average or normal intelligence
80 – 89 – Dullness
70 – 79 – Borderline deficiency in intelligence
Under 70 – Feeble-mindedness

Normal Distribution of IQ Scores

From the curve above, we see the following (the short computational check after this list reproduces these figures):

50% of IQ scores fall between 90 and 110
68% of IQ scores fall between 85 and 115
95% of IQ scores fall between 70 and 130
99.5% of IQ scores fall between 60 and 140
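
Those percentages follow directly from a normal curve with mean 100 and standard deviation 15, and they can be checked in a few lines of Python using only the standard library:

    import math

    def norm_cdf(x, mu=100.0, sigma=15.0):
        # Cumulative probability of a normal(mu, sigma) distribution.
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

    for lo, hi in [(90, 110), (85, 115), (70, 130), (60, 140)]:
        print(f"{lo}-{hi}: {norm_cdf(hi) - norm_cdf(lo):.1%}")
    # Prints roughly 49.5%, 68.3%, 95.4%, and 99.2% -- matching the rounded
    # figures above (the last band comes out nearer 99.2% than 99.5%).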

Low IQ & Mental Retardation

An IQ under 70 is considered “mental retardation,” or limited mental ability. Roughly 2.5 percent of the population scores below 70 on IQ tests (the lower tail of the 95 percent band noted above). The severity of the mental retardation is commonly broken into four levels:

50-70 – Mild mental retardation (85%)
35-50 – Moderate mental retardation (10%)
20-35 – Severe mental retardation (4%)
IQ < 20 – Profound mental retardation (1%)

High IQ & Genius IQ

Genius or near-genius IQ is considered to start around 140 to 145. Less than 1/4 of 1 percent fall into this category. Here are some common designations on the IQ scale:

115-124 – Above average
125-134 – Gifted
135-144 – Very gifted
145-164 – Genius
165-179 – High genius
180-200 – Highest genius

We are told “Big Al” (Albert Einstein) had an IQ over 160, which would definitely qualify him as one of the most intelligent people on the planet.

Big Al and IQ

Looking at demographics, we see the following:

How Smart is Smart

As you can see, the percentage of individuals considered to be genius is quite small: 0.50 percent, to be exact. OK, who are these people?

  1. Stephen Hawking

Dr. Hawking is a man of science, a theoretical physicist and cosmologist. Hawking has never failed to astonish everyone with his IQ level of 160. He was born in Oxford, England, and has proven himself to be a remarkably intelligent person. Hawking is an Honorary Fellow of the Royal Society of Arts, a lifetime member of the Pontifical Academy of Sciences, and a recipient of the Presidential Medal of Freedom, the highest civilian award in the United States. Hawking was the Lucasian Professor of Mathematics at the University of Cambridge between 1979 and 2009. Hawking has a motor neuron disease related to amyotrophic lateral sclerosis (ALS), a condition that has progressed over the years. He is almost entirely paralyzed and communicates through a speech-generating device. Even with this condition, he maintains a very active schedule, demonstrating significant mental ability.

  2. Andrew Wiles

Sir Andrew John Wiles is a remarkably intelligent individual. Sir Andrew is a British mathematician, a member of the Royal Society, and a research professor at Oxford University. His specialty is number theory. He proved Fermat’s Last Theorem, and for this effort he was awarded a special silver plaque by the International Mathematical Union in lieu of the Fields Medal. It is reported that he has an IQ of 170.

  3. Paul Gardner Allen

Paul Gardner Allen is an American business magnate, investor, and philanthropist, best known as the co-founder of Microsoft Corporation. As of March 2013, he was estimated to be the 53rd-richest person in the world, with an estimated wealth of $15 billion. His IQ is reported to be 170. He is considered to be one of the most influential people in his field and is known to be a good decision maker.

  4. Judit Polgar

Born in Hungary in 1976, Judit Polgár is a chess grandmaster. She is by far the strongest female chess player in history. In 1991, Polgár achieved the title of Grandmaster at the age of 15 years and 4 months, at the time the youngest person ever to do so. Polgár is not only a chess master but a certified brainiac with a recorded IQ of 170. Her childhood was filled with extensive chess training from her father. She has defeated nine former and current world champions, including Garry Kasparov, Boris Spassky, and Anatoly Karpov. Quite amazing.

  5. Garry Kasparov

Garry Kasparov has totally amazed the world with his outstanding IQ of more than 190. He is a Russian chess Grandmaster, former World Chess Champion, writer, and political activist, considered by many to be the greatest chess player of all time. From 1986 until his retirement in 2005, Kasparov was ranked world No. 1 for 225 months.  Kasparov became the youngest ever undisputed World Chess Champion in 1985 at age 22 by defeating then-champion Anatoly Karpov.   He held the official FIDE world title until 1993, when a dispute with FIDE led him to set up a rival organization, the Professional Chess Association. In 1997 he became the first world champion to lose a match to a computer under standard time controls, when he lost to the IBM supercomputer Deep Blue in a highly publicized match. He continued to hold the “Classical” World Chess Championship until his defeat by Vladimir Kramnik in 2000.

  6. Rick Rosner

Richard G. “Rick” Rosner (born May 2, 1960) is an American television writer and media figure gifted with an amazing IQ of 192 and known for his high intelligence test scores and his unusual career. There are reports that he has achieved some of the highest scores ever recorded on IQ tests designed to measure exceptional intelligence. He has become known for taking part in activities not usually associated with geniuses.

  7. Kim Ung-Yong

With a verified IQ of 210, Korean civil engineer Kim Ung-Yong is considered to be one of the smartest people on the planet. He was born March 7, 1963, and was definitely a child prodigy. He started speaking at the age of 6 months and was able to read Japanese, Korean, German, English, and many other languages by his third birthday. When he was four years old, his father said he had memorized about 2,000 words in both English and German. He was writing poetry in Korean and Chinese and wrote two very short books of essays and poems (less than 20 pages). Kim was listed in the Guinness Book of World Records under “Highest IQ”; the book gave the boy’s score as about 210. (Guinness retired the “Highest IQ” category in 1990 after concluding IQ tests were too unreliable to designate a single record holder.)

  8. Christopher Hirata

Christopher Hirata’s IQ is approximately 225, which is phenomenal. He was a genius from childhood. At the age of 16, he was working with NASA on its Mars mission. At the age of 22, he obtained a PhD from Princeton University. Hirata teaches astrophysics at the California Institute of Technology.

  9. Marilyn vos Savant

Marilyn vos Savant is said to have an IQ of 228. She is an American magazine columnist, author, lecturer, and playwright who rose to fame as a result of her listing in the Guinness Book of World Records under “Highest IQ.” Since 1986 she has written “Ask Marilyn,” a Parade magazine Sunday column in which she solves puzzles and answers questions on various subjects.

  10. Terence Tao

Terence Tao is an Australian mathematician working in harmonic analysis, partial differential equations, additive combinatorics, ergodic Ramsey theory, random matrix theory, and analytic number theory. He holds the James and Carol Collins Chair in Mathematics at the University of California, Los Angeles, where, at age 24, he became the youngest person ever promoted to full professor. He was a co-recipient of the 2006 Fields Medal and the 2014 Breakthrough Prize in Mathematics.

Tao was a child prodigy, one of the subjects in the longitudinal research on exceptionally gifted children by education researcher Miraca Gross. His father told the press that at the age of two, during a family gathering, Tao attempted to teach a 5-year-old child arithmetic and English. According to Smithsonian Online Magazine, Tao could carry out basic arithmetic by the age of two. When asked by his father how he knew numbers and letters, he said he learned them from Sesame Street.

OK, now before you go running to jump from the nearest bridge, consider the statement below:

Persistence. President Calvin Coolidge said it better than anyone I have ever heard: “Nothing in the world can take the place of persistence. Talent will not; nothing is more common than unsuccessful men with talent. Genius will not; unrewarded genius is almost a proverb. Education will not; the world is full of educated derelicts. Persistence and determination alone are omnipotent. The slogan ‘Press On’ has solved and always will solve the problems of the human race.”

I personally think Calvin really knew what he was talking about. Most of us get it done by persistence!! ’Nuff said.

PAYBACK

August 23, 2014


It is a very sad day when we lose an American citizen, and doubly sad when the loss is due to terrorist activity. James Wright Foley, a photojournalist, was captured by ISIS while filming in Syria. He was held captive for over a year and beheaded by that remarkably brutal terrorist organization this past week. The gruesome video, posted on YouTube, has now been taken down.

I have seen digital photographs of children, Christian children, beheaded by these thugs. They will stop at nothing to spread fear throughout the Middle East and, eventually, to Western powers unless stopped. They apparently are well funded and well organized, and they use American weapons abandoned when Iraqi military forces deserted their posts. Those forces, for the most part, offered no resistance to the ISIS movement from Syria into Iraq, and, of course, we provided no incentives for them to turn back. We watched and did nothing. We did not heed the warning, and now it appears the “cat is out of the bag.”

I have no idea what our response, if any, will be, but reality indicates we must do something to stop this spread of terror. The only manner that seems effective is elimination: kill them. They cannot be reasoned with, and diplomacy obviously will not be the path leading to resolution of this growing problem. If we look at the areas controlled by ISIS, we see the following:

ISIS MAP

We are told they are coming for us and will not be satisfied until their flag flies from our White House.

All indications are there will be no “boots on the ground.” If a military response is planned, it will be by virtue of air power. Maybe that will be enough, but who really knows? With that being the case, let’s look at what we have in our arsenal.

F-22 Raptor

This is the era of the F-22 Raptor – the world’s premier 5th Generation fighter.

The F-22 is the only fighter capable of simultaneously conducting air-to-air and air-to-ground combat missions with near impunity. This is accomplished with a never-before-seen standard of survivability even in the face of sophisticated airborne and ground-based threats.

In addition to being America’s premier air-superiority fighter, the F-22 evolved from its original concept to become a lethal, survivable and flexible multi-mission fighter. By taking advantage of emerging technologies, the F-22 has emerged as a superior platform for many diverse missions including intelligence gathering, surveillance, reconnaissance and electronic attack.

The Raptor is operational today, protecting our homeland and combat ready for worldwide deployment. F-22s are already assigned to multiple bases across the country.

F-35 Lightning II

The Lockheed Martin F-35 Lightning II is a family of single-seat, single-engine, all-weather stealth multirole fighters currently under development. The fifth-generation combat aircraft is designed to perform ground attack, reconnaissance, and air defense missions. The F-35 has three main models: the F-35A conventional takeoff and landing (CTOL) variant, the F-35B short take-off and vertical-landing (STOVL) variant, and the F-35C carrier-based CATOBAR (CV) variant.

The F-35 is descended from the X-35, the winning design of the Joint Strike Fighter (JSF) program. It is being designed and built by an aerospace industry team led by Lockheed Martin. Other major F-35 industry partners include Northrop Grumman, Pratt & Whitney, and BAE Systems. The F-35 took its first flight on 15 December 2006. The United States plans to buy 2,443 aircraft. The F-35 variants are intended to provide the bulk of the manned tactical airpower of the U.S. Air Force, Marine Corps, and Navy over the coming decades. Deliveries of the F-35 for the U.S. military are scheduled to be completed in 2037. It should be noted that problems do exist with this aircraft and it is not yet fully operational.

F-15

The F-15E Strike Eagle is a superior next generation multi-role strike fighter that is available today. Its unparalleled range, persistence and weapons load make it the backbone of the U.S. Air Force (USAF). A complement of the latest advanced avionics systems gives the Strike Eagle the capability to perform air-to-air or air-to-surface missions at all altitudes, day or night, in any weather.

The F-15 is a twin-engine, high-performance, all-weather air superiority fighter. First flown in 1972, the Eagle entered U.S. Air Force service in 1974. The Eagle’s most notable characteristics are its great acceleration and maneuverability. It was the first U.S. fighter with engine thrust greater than the basic weight of the aircraft, allowing it to accelerate while in a vertical climb. Its great power, light weight and large wing area combine to make the Eagle very agile.

The F-15 has been produced in single-seat and two-seat versions in its many years of USAF service. The two-seat F-15E Strike Eagle version is a dual-role fighter that can engage both ground and air targets. F-15C, -D, and -E models participated in OPERATION DESERT STORM in 1991, accounting for 32 of 36 USAF air-to-air victories and also attacking Iraqi ground targets. F-15s also served in Bosnia (1994), downed three Serbian MiG-29 fighters in OPERATION ALLIED FORCE (1999), and enforced no-fly zones over Iraq in the 1990s. Eagles also hit Afghan targets in OPERATION ENDURING FREEDOM, and the F-15E version performed air-to-ground missions in OPERATION IRAQI FREEDOM.

F-16

The General Dynamics (now Lockheed Martin) F-16 Fighting Falcon is a single-engine multirole fighter aircraft originally developed by General Dynamics for the United States Air Force (USAF). Designed as an air superiority day fighter, it evolved into a successful all-weather multirole aircraft. Over 4,500 aircraft have been built since production was approved in 1976.  Although no longer being purchased by the U.S. Air Force, improved versions are still being built for export customers. In 1993, General Dynamics sold its aircraft manufacturing business to the Lockheed Corporation, which in turn became part of Lockheed Martin after a 1995 merger with Martin Marietta.

F-117

The F-117A Nighthawk is the world’s first operational aircraft designed to exploit low-observable stealth technology. The unique design of the single-seat F-117A provides exceptional combat capabilities. About the size of an F-15 Eagle, the twin-engine aircraft is powered by two General Electric F404 turbofan engines and has quadruple redundant fly-by-wire flight controls. Air refuelable, it supports worldwide commitments and adds to the deterrent strength of the U.S. military forces.

The first F-117A was delivered in 1982, and the last delivery was in the summer of 1990. The F-117A production decision was made in 1978 with a contract awarded to Lockheed Advanced Development Projects, the “Skunk Works,” in Burbank, Calif. The first flight was in 1981, only 31 months after the full-scale development decision. Lockheed delivered 59 stealth fighters to the Air Force between August 1982 and July 1990. Five additional test aircraft belong to the company.

F/A-18

The McDonnell Douglas (now Boeing) F/A-18 Hornet is a twin-engine, supersonic, all-weather, carrier-capable multirole combat jet, designed as both a fighter and attack aircraft (the F/A designation stands for Fighter/Attack). Designed by McDonnell Douglas and Northrop, the F/A-18 was derived from the latter’s YF-17 in the 1970s for use by the United States Navy and Marine Corps. The Hornet is also used by the air forces of several other nations. The U.S. Navy’s Flight Demonstration Squadron, the Blue Angels, has used the Hornet since 1986.

The F/A-18 has a top speed of Mach 1.8 (1,190 mph or 1,915 km/h at 40,000 ft or 12,190 m). It can carry a wide variety of bombs and missiles, including air-to-air and air-to-ground, supplemented by the 20 mm M61 Vulcan cannon. It is powered by two General Electric F404 turbofan engines, which give the aircraft a high thrust-to-weight ratio. The F/A-18 has excellent aerodynamic characteristics, primarily attributed to its leading edge extensions (LEX). The fighter’s primary missions are fighter escort, fleet air defense, Suppression of Enemy Air Defenses (SEAD), air interdiction, close air support, and aerial reconnaissance. Its versatility and reliability have proven it to be a valuable carrier asset, though it has been criticized for its lack of range and payload compared to its earlier contemporaries, such as the Grumman F-14 Tomcat in the fighter and strike fighter role and the Grumman A-6 Intruder and LTV A-7 Corsair II in the attack role.

A-10

The A-10 Thunderbolt II, affectionately nicknamed “The Warthog,” was developed for the United States Air Force by the OEM team from Fairchild Republic Company, now a part of Northrop Grumman Corporation Aerospace Systems Eastern Region, located in Bethpage, NY, and St. Augustine, FL. Following in the footsteps of the legendary P-47 Thunderbolt, the OEM team was awarded a study contract in the 1960s to define requirements for a new close air support aircraft, rugged and survivable, to protect combat troops on the ground. This initial study was followed by a prototype development contract for the A-X and a final fly-off competition resulting in the selection of the A-10 Thunderbolt II.

Selection of the A-10 Thunderbolt II for this mission was based on the dramatic low-altitude maneuverability, lethality, “get home safe” survivability, and mission-capable maintainability designed into the jet by the OEM team. The design features a titanium “bathtub” that protects the pilot from injury and dual-redundant flight control systems that allow the pilot to fly the aircraft out of enemy range despite severe damage, such as complete loss of hydraulic capability. These features have been used to great effect in both the Desert Storm conflict of the 1990s and the more recent Enduring Freedom, Iraqi Freedom, and Global War on Terror engagements.

In 1987, the A-10 OEM team and all A-10 assets were acquired by Grumman Corporation from Fairchild Republic Company; they are now part of the Northrop Grumman Aerospace Systems Eastern Region, presently partnered with Lockheed Martin Systems Integration as a member of the A-10 Prime Team.

AV-8B Harrier

The Harrier today is one of the truly unique and most widely known military aircraft. It is unique as the only fixed-wing V/STOL aircraft in the free world. It is also unusual in the international nature of its development, which brought the design from the first British P.1127 prototype to the AV-8B Harrier II of today.

When the Harrier II was first flown in the fall of 1981, 21 years had elapsed since the original Hawker P.1127 first hovered in untethered flight. This basic design, only one of many promising concepts of the time, has weathered its growing up period and reached maturity in the AV-8B.

The 1957 design for the P.1127 was based on a French engine concept, adopted and improved upon by the British. The project was funded by the British Bristol Engine Co. and by the U.S. Government through the Mutual Weapons Development Program.

With the basic configuration of the engine largely determined and with development work under way, Hawker Aircraft Ltd. engineers directed their attention to designing a V/STOL aircraft that would use the engine. Without government/military customer support, they produced a single-engine attack-reconnaissance design that was as simple a V/STOL aircraft as could be devised. Other than the engine’s swivelling nozzles, the reaction control system was the only complication in the effort to provide V/STOL capability.

F-14

The F-14 Tomcat is a supersonic, twin-engine, variable sweep wing, two-place strike fighter manufactured by Grumman Aircraft Corporation. The multiple tasks of navigation, target acquisition, electronic counter measures (ECM), and weapons employment are divided between the pilot and the radar intercept officer (RIO). Primary missions include precision strike against ground targets, air superiority, and fleet air defense.

The F-14 has completed its decommissioning from the U.S. Navy. It was slated to remain in service through at least 2008, but all F-14A and F-14B airframes have already been retired, and the last two squadrons, the VF-31 Tomcatters and the VF-213 Black Lions, both flying the “D” models, arrived for their last fly-in at Naval Air Station Oceana on March 10, 2006.

CONCLUSIONS

I think ISIS, or ISIL, is a real and present danger. We can no longer talk our way around this situation; talk is no solution. Playing golf is no solution. Waiting for the next administration is no solution. The only recourse we have is to kill them. Do not let ISIS live to see another sunrise. Let them enjoy their seventy-two (72) virgins sooner rather than later. I would enjoy your comments.

NASA TECH BRIEFS

August 9, 2014


One of the very best publications existing today is “NASA TECH BRIEFS, Engineering Solutions for Design & Manufacturing.” This monthly publication strives to transfer technology from NASA design centers to university and corporate entities in the hope that the research and development can be commercialized in some fashion. In my opinion, it is a marvelous resource and demonstrates avenues of investigation separate and apart from what we have come to know as the recognized NASA mission. As you well know, in the process of exploration there are many very useful “down-to-Earth” developments that can be utilized and commercialized to benefit manufacturing and our populace at large. These are enumerated in this publication. Several distinct areas within the magazine highlight papers and studies, as follows:

  • Technology Focus: Mechanical Components
  • Manufacturing & Prototyping
  • Materials & Coatings
  • Electronics/Components
  • Physical Sciences
  • Patents of Note
  • New For Design Engineers

As you can see, each of these areas concentrates upon differing subjects, all relating to engineering and product design.

Let me now mention several papers from Volume 38, Number 8, the August 2014 edition. This will give you some feel for the investigative work coming from the NASA research centers across our country.

  • “Extreme Low Frequency Acoustic Measurement System”, Langley Research Center, Hampton, Virginia.
  • “Piezoelectric Actuated Valve for Operation in Extreme Conditions”, Jet Propulsion Laboratory, Pasadena, California.
  • “Compact Active Vibration Control System”, Langley Research Center, Hampton, Virginia.
  • “Rotary Series Elastic Actuator”, Johnson Space Center, Houston, Texas.
  • “HALT Technique to Predict the Reliability of Solder Joints in a Shorter Duration”, Jet Propulsion Laboratory, Pasadena, California.

I feel one of the great failures of our federal government is the abdication of manned space programs. WE REALLY SCREWED UP on this one. If you have read any of my previous postings on this subject, you will understand my complete and utter amazement relative to that decision by the Executive and Legislative branches of our government. This, to some extent, underscores the deplorable lack of vision existing at the highest levels. We have decided to let the Russians get us up and back. Very bad decision on our part. Now, it is important to note that NASA is far from dormant; NASA is working.

Let’s take a look at the various NASA locations and the areas of research they are undertaking.

  • Ames Research Center:  Technological Strengths: Information Technology, Biotechnology, Nanotechnology, Aerospace Operations Systems, Rotorcraft, Thermal Protection Systems.
  • Armstrong Flight Research Center: Technological Strengths: Aerodynamics, Aeronautics Flight Testing, Aeropropulsion, Flight Systems, Thermal Testing, Integrated Systems Test and Validation.
  • Glenn Research Center: Technological Strengths: Aeropropulsion, Communications, Energy Technology, High-Temperature Materials Research.
  • Goddard Space Flight Center: Technological Strengths:  Earth and Planetary Science Missions, LIDAR, Cryogenic Systems, Tracking, Telemetry, Remote Sensing, Command.
  • Jet Propulsion Laboratory: Technological Strengths: Near/Deep-Space Mission Engineering, Microspacecraft, Space Communications, Information Systems, Remote Sensing, Robotics.
  • Johnson Space Center:  Technological Strengths: Artificial Intelligence and Human Computer Interface, Life Sciences, Human Space Flight Operations, Avionics, Sensors, Communication.
  • Kennedy Space Center: Technological Strengths: Fluids and Fluid Systems, Materials Evaluation, Process Engineering, Command, Control, and Monitor Systems, Range Systems, Environmental Engineering and Management.
  • Langley Research Center:  Technological Strengths: Aerodynamics, Flight Systems, Materials, Structures, Sensors, Measurements, Information Sciences.
  • Marshall Space Flight Center: Technological Strengths: Materials, Manufacturing, Nondestructive Evaluations, Biotechnology, Space Propulsion, Controls and Dynamics, Structures, Microgravity Processing.
  • Stennis Space Center: Technological Strengths: Propulsion Systems, Test/Monitoring, Remote Sensing, Nonintrusive Instrumentation.
  • NASA Headquarters: Technological Strengths: NASA Planning and Management.

I can strongly recommend the “Tech Briefs” publication to you. It’s free. You may find that further investigation into these areas of research can benefit you and your company. Take a look.

As always, I welcome your comments.  Many thanks.

RELIABILITY ENGINEERING

August 5, 2014


The following post is taken from a PDHonline course this author has written for professional engineers. The entire course may be found at PDHonline.org; look for “Introduction to Reliability Engineering.”

INTRODUCTION:

One of the most difficult issues when designing a product is determining how long it will last and how long it should last. If the product is robust to the point of lasting “forever,” the purchase price will probably be prohibitive compared with the competition. If it “dies” the first week, you will eventually lose all sales momentum and your previous marketing efforts will be for naught. It is absolutely amazing to me how many products are dead on arrival. They don’t work, right out of the box. This is an indication of slipshod design, manufacturing, assembly, or all of the above. It is definitely possible to design and build quality and reliability into a product so that the end user is very satisfied and feels as though he got his money’s worth.

The medical, automotive, aerospace, and weapons industries are certainly dependent upon reliability methods to ensure safe and usable products, so premature failure is not an issue. The same can be said for consumer products if reliability methods are applied during the design phase of the development program. Reliability methodology will provide products that “fail safe,” if they fail at all. Component failures are not uncommon in any assembly of parts, but how a component fails can mean the difference between a product that just won’t work and one that can cause significant injury or even death to the user. It is very interesting to note that German and Japanese companies have put more effort into designing in quality at the product development stage; U.S. companies seem to place a greater emphasis on solving problems after a product has been developed. [5] Engineers in the United States do an excellent job when cost-reducing a product through part elimination, standardization, material substitution, etc., but sometimes those efforts relegate reliability to the “back burner.” Producibility, reliability, and quality start with design, at the beginning of the process, and should remain the primary concern throughout product development, testing, and manufacturing.

QUALITY VS RELIABILITY:

There seems to be general confusion between quality and reliability. Quality is the “totality of features and characteristics of a product that bear on its ability to satisfy given needs; fitness for use.” Reliability is “a design parameter associated with the ability or inability of a product to perform as expected over a period of time.” It is definitely possible to have a product of considerable quality but one with questionable reliability. Quality AND reliability are crucial today, given the degree of technological sophistication, even in consumer products. As you well know, the incorporation of computer-driven and/or computer-controlled products has exploded over the past two decades. There is now an engineering discipline called MECHATRONICS that focuses solely on combining mechanics, electronics, control engineering, and computing. Mr. Tetsuro Mori, a senior engineer working for a Japanese company called Yaskawa, first coined the term. The discipline is also alternately referred to as electromechanical systems. With added complexity comes the very real need to “design in” quality and reliability and to quantify the characteristics of operation, including the failure rate, the mean time between failures (MTBF), and the mean time to failure (MTTF). Adequate testing will also indicate which components and subsystems are susceptible to failure under given conditions of use. This information is critical to marketing, sales, engineering, manufacturing, quality and, of course, the VP of Finance who pays the bills.
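
To make the failure rate/MTBF relationship concrete, here is a minimal Python sketch that computes point estimates from a set of observed times-to-failure. The hour values are invented for illustration, and the sketch assumes a constant-failure-rate (exponential) model, under which the failure rate is simply the reciprocal of the MTBF.

    # Hypothetical times-to-failure, in hours, from a life test of ten units.
    times = [812, 1450, 3020, 415, 2210, 980, 1760, 2675, 1330, 540]

    mtbf = sum(times) / len(times)  # point estimate of mean time between failures
    failure_rate = 1.0 / mtbf       # lambda, failures per hour (exponential model)

    print(f"MTBF         = {mtbf:.0f} hours")
    print(f"failure rate = {failure_rate:.6f} failures/hour")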

Every engineer involved with the design and manufacture of a product should have a basic knowledge of quality and reliability methods and practices.

DEFINITIONS

I think it’s appropriate to define Reliability and Reliability Engineering.  As you will see, there are several definitions, all basically saying the same thing, but important to mention, thereby grounding us for the course to follow.

“Reliability is, after all, engineering in its most practical form.”

James R. Schlesinger

Former Secretary of Defense

“Reliability is a projection of performance over periods of time and is usually defined as a quantifiable design parameter. Reliability can be formally defined as the probability or likelihood that a product will perform its intended function for a specified interval under stated conditions of use.”

John W. Priest

Engineering Design for Producibility and Reliability

“Reliability engineering provides the tools whereby the probability and capability of an item performing intended functions for specified intervals in specified environments without failure can be specified, predicted, designed-in, tested, demonstrated, packaged, transported, stored, installed, and started up, and their performance monitored and fed back to all organizations.”

Unknown

“Reliability is the science aimed at predicting, analyzing, preventing, and mitigating failures over time.”

John D. Healy, PhD

“Reliability is blood, sweat, and tears engineering to find out what could go wrong, to organize that knowledge so it is useful to engineers and managers, and then to act on that knowledge.”

Ralph A. Evans

“The conditional probability, at a given confidence level, that the equipment will perform its intended function for a specified mission time when operating under the specified application and environmental stresses.”

The General Electric Company

“By its most primitive definition, reliability is the probability that no failures will occur in a given time interval of operation. This time interval may be a single operation, such as a mission, or a number of consecutive operations or missions. The opposite of reliability is unreliability, which is defined as the probability of failure in the same time interval.”

Igor Bazovsky

“Reliability Theory and Practice”

Personally, I like the definition given by Dr. Healy, although the phrase “performing intended functions for specified intervals in specified environments” adds a reality to the definition that really should be there. Also, reliability data generally carry an associated confidence level. We will definitely discuss confidence level later on and how it factors into the reliability process. Reliability, like all other disciplines, has its own specific vocabulary, and understanding “the words” is absolutely critical to the overall process we wish to follow.
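
Since several of these definitions mention confidence level, here is a small sketch of how it enters the arithmetic. For a time-truncated life test, a standard chi-square relationship gives a one-sided lower confidence bound on MTBF; the test hours and failure count below are invented for illustration, and the snippet assumes SciPy is available.

    from scipy.stats import chi2

    def mtbf_lower_bound(total_hours, failures, confidence=0.90):
        # One-sided lower confidence bound on MTBF for a time-truncated
        # test: 2T divided by the chi-square quantile with 2r + 2
        # degrees of freedom.
        df = 2 * failures + 2
        return 2.0 * total_hours / chi2.ppf(confidence, df)

    # Hypothetical test: 5,000 unit-hours accumulated, 3 failures observed.
    print(mtbf_lower_bound(5000, 3))  # about 748 hours at 90% confidence

Raising the confidence level widens the chi-square quantile and drives the guaranteed MTBF down; that trade-off is exactly what the definitions above are pointing at.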

DISCUSSION

The main goal of reliability engineering is to minimize the failure rate by maximizing the MTTF. The two main goals of design for reliability are:

  • Predict the reliability of an item, i.e., component, subsystem, and system (fit the life model and/or estimate the MTTF or MTBF).
  • Design for, and test in, the environments that promote failure. [10] To do this, we must understand the KNPs (Key Noise Parameters) and KCPs (Key Control Parameters) of the entire system, or at least of the mission-critical subassemblies of the system.

The overall effort is concerned with eliminating early failures by observing their distribution and determining, accordingly, the length of time necessary for debugging and the methods used to debug a system or subsystem. Further, it is concerned with preventing wearout failures by observing the statistical distribution of wearout and determining the preventative replacement periods for the various parts. This equates to knowing the MTTF and MTBF. Finally, its main attention is focused on chance failures and their prevention, reduction, or complete elimination, because it is the chance failures that most affect equipment reliability in actual operation. One method of accomplishing these goals is the development and refinement of mathematical models. These models, properly structured, define and quantify the operation and usage of components and systems.
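
The three failure regimes just described (early, chance, and wearout failures) map neatly onto the Weibull life model, one of the most common of those mathematical models, where the shape parameter indicates which regime dominates. A minimal Python sketch, with made-up parameter values:

    import math

    def weibull_reliability(t, beta, eta):
        # Probability of surviving to time t under a Weibull life model.
        #   beta < 1  -> early (infant-mortality) failures dominate
        #   beta = 1  -> constant failure rate (chance failures)
        #   beta > 1  -> wearout failures dominate
        return math.exp(-((t / eta) ** beta))

    # Hypothetical fit: shape beta = 2.3 (wearout), scale eta = 4000 hours.
    for t in (1000, 2000, 4000):
        print(t, round(weibull_reliability(t, 2.3, 4000.0), 3))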

PROCESS:

No mechanical or electromechanical product will last forever without preventative maintenance and/or replacement of critical components. Reliability engineering seeks to discover the weakest link in the system or subsystem so any eventual product failure may be predicted and consequently forestalled. Any operational interruption may be eliminated by periodically replacing a part or an assembly of parts prior to failure; a small sketch of this calculation follows the list below. This predictive ability is achieved by knowing the mean time to failure (MTTF) and the mean time between failures (MTBF) for “mission critical” components and assemblies. With this knowledge, we can provide for continuous and safe operation, relative to a given set of environmental conditions and proper usage of the equipment itself. The test-analyze-and-fix (TAAF) approach is used throughout reliability testing to discover which components are candidates for continuous “preventative maintenance” and possibly ultimate replacement. Sometimes designing redundancy into a system can prolong the operational life of a subsystem or system, but that is generally costly for consumer products. Usually, this is done only when the product absolutely must survive the most rigorous environmental conditions and circumstances. Most consumer products do not have redundant systems. Airplanes, medical equipment, and aerospace equipment represent products that must have redundant systems for the sake of continued safety for those using the equipment. As mentioned earlier, at the very worst, we ALWAYS want our mechanism to “fail safe” with absolutely no harm to the end-user or other equipment. This can be accomplished through engineering design and a strong adherence to accepted reliability practices. With this in mind, we start the process by recommending the following steps:

  • Establish reliability goals and allocate reliability targets.
  • Develop functional block diagrams for all critical systems
  • Construct P-diagrams to identify and define KCPs and KNPs
  • Benchmark current designs
  • Identify the mission critical subsystems and components
  • Conduct FMEAs
  • Define and execute pre-production life tests; i.e. growth testing
  • Conduct life predictions
  • Develop and execute reliability audit plans
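
As promised above, here is a small sketch of the preventative-replacement calculation: invert the Weibull reliability function to find the time at which predicted reliability falls to a chosen target, and replace the part no later than that. The shape and scale values are hypothetical, carried over from the earlier sketch.

    import math

    def replacement_interval(target_reliability, beta, eta):
        # Time at which Weibull reliability decays to the target level;
        # schedule preventative replacement no later than this.
        return eta * (-math.log(target_reliability)) ** (1.0 / beta)

    # Replace before reliability drops below 95% (beta = 2.3, eta = 4000 h).
    print(round(replacement_interval(0.95, 2.3, 4000.0)))  # about 1,100 hours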

It is appropriate to mention now that this document assumes the product design is, at least, in the design confirmation phase of the development cycle and that we have been given approval to proceed. Most NPI (new product introduction) methodologies carry a product through design guidance, design confirmation, pre-pilot, pilot, and production phases. Generally, at the pre-pilot point, the design is solidified so that evaluation and reliability testing can be conducted with assurance that any changes will be fairly minor and will not involve a “wholesale” redesign of any component or subassembly. This is not to say that when “mission critical” components fail we do not make every effort to correct the failure(s) and put the product back into reliability testing. At the pre-pilot phase, the market surveys, consumer focus studies, and all of the QFD work have been accomplished, and we have tentative specifications for our product. Initial prototypes have been constructed, and upper management has “signed off” and given approval to proceed into the next development cycles of the project. ONE CAUTION: Any issue involving safety of use must be addressed regardless of the changes necessary for an adequate “fix.” This is imperative and must occur if failures arise, no matter what phase of the program is in progress.

Critical to these efforts will be conducting HALT (highly accelerated life testing) and HAST (highly accelerated stress testing) to “make the product fail.” This will involve DOE (Design of Experiments) planning to quantify AND verify FMEA estimates. Significant time may be saved by carefully structuring a reliability evaluation plan to be executed at the component, subsystem, and system levels. If you couple these tests with appropriate field testing, you will develop a product that will “go the distance” relative to your goals and stay well within your SCR (Service Call Rate) requirements. Reliability testing must be an integral part of the basic design process, and time must be given to this effort. The NPI process always includes reliability testing and assessment of the results from that testing. Invariably, some degree of component or subsystem redesign results from HALT or HAST, because weaknesses are made known that can and will be eliminated by redesign. In times past, the engineering practice has been to assign a “safety factor” to any design. This safety factor takes into consideration “unknowns” that may affect the basic design. Unfortunately, it may produce a design that is structurally robust but one that fails due to Key Noise Parameters (KNPs) or Key Control Parameters (KCPs).
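
As a trivial illustration of the DOE step, the snippet below enumerates a full-factorial test matrix over a few stress factors of the kind HALT planning might exercise. The factors and levels are invented for illustration; a real plan would come out of the FMEA.

    from itertools import product

    # Hypothetical stress factors and levels for an accelerated-test matrix.
    factors = {
        "temperature_C": [-40, 25, 85],
        "vibration_grms": [5, 20],
        "voltage_pct_nominal": [90, 110],
    }

    runs = list(product(*factors.values()))
    print(f"{len(runs)} runs in the full-factorial matrix")  # 3 x 2 x 2 = 12
    for run in runs:
        print(dict(zip(factors.keys(), run)))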

SUMMARY

As you might expect, this is a “lick and a promise” relative to the subject of reliability. It’s a very complex subject, but one that has provided remarkable life and quality to consumer and commercial products. I would invite you to take a look at the literature and further your understanding of the “ins and outs” of the technology. As always, I welcome your comments.