C.T.E.

September 22, 2017


Portions of this post are taken from the New York Times article by Ken Belson, Sept. 21, 2017.

There has been a great deal of discussion in this country about the effect of “impact sports” on cognitive ability.  From the NYT article excerpted below, you can see the possible implications of repetitive concussions received during a very short time in the NFL.

The brain scan came as a surprise even to researchers who for years have been studying the relationship between brain disease and deaths of professional football players.

Aaron Hernandez, the former New England Patriots tight end and a convicted murderer, was 27 when he committed suicide in April. Yet a posthumous examination of his brain showed he had such a severe form of the degenerative brain disease C.T.E. that the damage was akin to that of players well into their 60s. 

C.T.E., or chronic traumatic encephalopathy, has been found in more than one hundred (100) former N.F.L. players, some of whom committed suicide, according to researchers at Boston University.

Yet the results of the study of Mr. Hernandez’s brain are adding another dimension to his meteoric rise and fall that could raise questions about the root of his erratic, violent behavior and lead to a potentially tangled legal fight with the N.F.L., the most powerful sports league in the United States.

WHAT IS C.T.E.?
Chronic Traumatic Encephalopathy (CTE) is a degenerative brain disease found in athletes, military veterans, and others with a history of repetitive brain trauma. In CTE, a protein called Tau forms clumps that slowly spread throughout the brain, killing brain cells. CTE has been seen in people as young as seventeen (17) years of age, but symptoms do not generally begin appearing until years after the onset of head impacts.  If a picture is worth a thousand words, we can see the effects of CTE with the image below:

As you can certainly see, there is a tremendous difference between the appearance of a healthy brain on the left and a brain ravaged by CTE on the right.

Early symptoms of CTE usually appear in a patient’s late twenties (20s) or thirties (30s), and affect a patient’s mood and behavior. Some common changes seen include impulse control problems, aggression, depression, and paranoia.

As the disease progresses, some patients may experience problems with thinking and memory, including memory loss, confusion, impaired judgment, and eventually progressive dementia. Cognitive symptoms tend to appear later than mood and behavioral symptoms, and generally first appear in a patient’s forties (40s) or fifties (50s). Patients may exhibit one or both symptom clusters. In some cases, symptoms worsen with time (even if the patient suffers no additional head impacts). In other cases, symptoms may be stable for years before worsening.

The best available evidence tells us that CTE is caused by repetitive hits to the head sustained over a period of years. This doesn’t mean a handful of concussions: most people diagnosed with CTE suffered hundreds or thousands of head impacts over the course of many years playing contact sports or serving in the military. And it’s not just concussions: the best available evidence points towards sub-concussive impacts, or hits to the head that don’t cause full-blown concussions, as the biggest factor. With that being the case, just who is at risk?  The chart below will give some idea.

SYMPTOMS OF C.T.E.:

As noted above, early symptoms of CTE affect a patient’s mood and behavior, with impulse control problems, aggression, depression, and paranoia among the most common changes. A short list is as follows:

  • Difficulty thinking (cognitive impairment). This might be in the form of confusion or significant delays in taking action.
  • Impulsive behavior. This impulsive behavior is generally “new” to the individual and does not represent normal behavior.
  • Depression or apathy.
  • Short-term memory loss. This is continuous short-term memory loss, much more significant than ordinary forgetfulness.
  • Difficulty planning and carrying out tasks (executive function).
  • Emotional instability. Emotional instability differs clinically from impulsive behavior, though both can appear as uncharacteristic reactions to a given set of circumstances.
  • Substance abuse.
  • Suicidal thoughts or behavior. This is exactly what happened to Aaron Hernandez; CTE, combined with being locked up 24/7, probably caused feelings of hopelessness.

CONCLUSIONS:

I remember as a kid just about getting down on one knee to ask my mom to allow me to play football.  There was a real battle in our house over that.  I was instructed to bring home the equipment I drew from the football inventory so mom and dad could take a look.  We immediately went to Martin-Thompson Sporting Goods to buy me a new helmet with a proper face mask.  Even back in the early sixties, head trauma was an issue, and every parent knew what could happen.  Equipment has improved, but so has the size of the players.  STILL A PROBLEM.


MULTITASKING

September 14, 2017


THE DEFINITION:

“Multitasking, in a human context, is the practice of doing multiple things simultaneously, such as editing a document or responding to email while attending a teleconference.”

THE PROCESS:

The concept of multitasking began in a computing context. Computer multitasking, similarly to human multitasking, refers to performing multiple tasks at the same time. In a computer, multitasking refers to things like running more than one application simultaneously.   Modern-day computers are designed for multitasking. For humans, however, multitasking has been decisively proven to be an ineffective way to work. Research going back to the 1980s has indicated repeatedly that performance suffers when people multitask.
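To see the contrast, here is a minimal sketch in Python of the computing sense of the term, where the operating system, not a person, juggles concurrent tasks. The task names and durations are invented for illustration.

  import threading
  import time

  def task(name, seconds):
      # Simulate a unit of work that takes some time to finish.
      time.sleep(seconds)
      print(name, "finished after", seconds, "seconds")

  # The operating system interleaves these threads on its own;
  # no human attention is divided in the process.
  jobs = [
      threading.Thread(target=task, args=("email sync", 2)),
      threading.Thread(target=task, args=("file download", 3)),
  ]
  for j in jobs:
      j.start()
  for j in jobs:
      j.join()
  print("all tasks complete")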

REALITY:

Multitasking is not a natural human trait.  Natural evolution may improve human abilities a few hundred years from now, but for the present we are just not good at it.  In 2007, an ABC Evening News broadcast cited the following: “People are interrupted once every ten and one-half (10.5) minutes.  It takes twenty-three (23) minutes to regain your train of thought.  People lose two point one (2.1) hours each day in the process of multitasking.”

A great article entitled “No Task Left Behind?” by Gloria Mark indicated that a person juggled twelve (12) work spheres each day and that fifty-seven percent (57%) of the work got interrupted.  As a result, twenty-three percent (23%) of the work to be accomplished that day got pushed to the next day and beyond. That was the case twelve years ago.  We have all been there, trying to get the most out of each day only to return home with frustration and more to do the next day.

Experience tells us that:

  • For students, an increase in multitasking predicted poorer academic results.
  • Multitaskers took longer to complete tasks and produced more errors.
  • People had more difficulty retaining new information while multitasking.
  • When tasks involved making selections or producing actions, even very simple tasks performed concurrently were impaired.
  • Multitaskers lost a significant amount of time switching back and forth between tasks, reducing their productivity by up to forty percent (40%).
  • Habitual multitaskers were less effective than non-multitaskers even when doing one task at any given time because their ability to focus was impaired.
  • Multitasking temporarily causes an IQ drop of 10 points, the equivalent of going without sleep for a full night.
  • Multitaskers typically think they are more effective than is actually the case.
  • There are limited amounts of energy for any one given day.
  • Multitasking can lessen interpersonal skills and actually detract from the effectiveness of the total work force.
  • It encourages procrastination.
  • A distracted state of mind may become permanent.

THE MYTH OF MULTITASKING:

People believe multitasking is a positive attribute, one to be admired. But multitasking is simply a lack of self-discipline. Multitasking is really switching your attention from one task to another to another, instead of giving yourself over to a single task. Multitasking is easy; disciplined focus and attention are difficult.

The quality of your work is determined by how much of your time, your focus and your attention you give it. While multitasking feels good and feels busy, the quality of the work is never what it could be with the creator’s full attention. More and more, this is going to be apparent to those who are judging the work, especially when compared to work of someone who is disciplined and who has given the same or similar project their full focus and attention.

MENTAL FLOW:

In positive psychology, flow, also known as the zone, is the mental state of operation in which a person performing an activity is fully immersed in a feeling of energized focus, full involvement, and enjoyment in the process of the activity.

The individual who coined the phrase “flow” was Mihaly Csikszentmihalyi. (Please do NOT ask me to pronounce Dr. Csikszentmihalyi’s last name.)  He made the following statement:

“The best moments in our lives are not the passive, receptive, relaxing times… The best moments usually occur if a person’s body or mind is stretched to its limits in a voluntary effort to accomplish something difficult and worthwhile.”

– Mihaly Csikszentmihalyi  

EIGHT CHARACTERISTICS OF “FLOW”:

  1. Complete concentration on the task.  By this we mean really complete.
  2. Clarity of goals and reward in mind, with immediate feedback. It is hard to focus and concentrate when there are no goals to indicate completion.
  3. Transformation of time (speeding up/slowing down of time). When in full “flow” mode, you lose track of time.
  4. The experience is intrinsically rewarding; it is an end in itself.
  5. Effortlessness and ease.
  6. There is a balance between challenge and skills.
  7. Actions and awareness are merged, losing self-conscious rumination.
  8. There is a feeling of control over the task.

I personally do not get there often, but the point is this: you cannot get in the “zone”, and you will not be able to achieve mental “flow”, when you are in multitasking mode.  It just will not happen.

As always, I welcome your comments.

V2V TECHNOLOGY

September 9, 2017


You probably know this by now if you read my postings—my wife and I love to go to the movies.  I said GO TO THE MOVIES, not download movies, but GO.  If you go to a matinée, and if you are a senior, you get a reduced rate.  We do that. Normally a movie beginning at 4:00 P.M. will get you out by 6:00 or 6:30 P.M., just in time for dinner. Coming from the Carmike Cinema on South Terrace, I looked left and slowly moved over to the inside lane—just in time to hit a car in my “blind spot”.  It was a low-impact “touching”, but nevertheless an accident.  All cars, I’m told, have blind spots, and ours certainly does.  Side mirrors do NOT cover all areas to the left and right of any vehicle.  Maybe there is a looming solution to that dilemma.

V2V:

The global automotive industry seems poised on the brink of a “Brave New World” in which connectivity and sensor technologies come together to create systems that can eliminate life-threatening collisions and enable automobiles that drive themselves.  Known as Cooperative Intelligent Transportation Systems, vehicle-to-vehicle or V2V technologies open the door for automobiles to share information and interact with each other, as well as with emerging smart infrastructure. These systems not only make transportation safer but also offer the promise of reducing traffic congestion.

Smart features of V2V promise to enhance driver awareness via traffic alerts, providing notifications on congestion, obstacles, lane changing, traffic merging and railway crossings.  Additional applications include:

  • Blind spot warnings
  • Forward collision warnings
  • Sudden brake-ahead warnings
  • Approaching emergency vehicle warnings
  • Rollover warnings
  • Travel condition data to improve maintenance services.

The Department of Transportation report “Vehicle-to-Vehicle Communications: Readiness of V2V Technology for Application” (DOT HS 812 014) already details the technology as follows:

“The purpose of this research report is to assess the readiness for application of vehicle-to-vehicle (V2V) communications, a system designed to transmit basic safety information between vehicles to facilitate warnings to drivers concerning impending crashes. The United States Department of Transportation and NHTSA have been conducting research on this technology for more than a decade. This report explores technical, legal, and policy issues relevant to V2V, analyzing the research conducted thus far, the technological solutions available for addressing the safety problems identified by the agency, the policy implications of those technological solutions, legal authority and legal issues such as liability and privacy. Using this report and other available information, decision-makers will determine how to proceed with additional activities involving vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), and vehicle-to-pedestrian (V2P) technologies.”

The agency estimates there are approximately five (5) million annual vehicle crashes, with attendant property damage, injuries, and fatalities. While it may seem obvious, if technology can help drivers avoid crashes, the damage due to crashes simply never occurs.  This is the intent of an operative V2V automotive system. While “vehicle-resident” crash avoidance technologies can be highly beneficial, V2V communications represent an additional step in helping to warn drivers about impending danger. V2V communications use on-board dedicated short-range radio communication devices to transmit messages about a vehicle’s speed, heading, brake status, and other information to other vehicles, and to receive the same information from their messages, with range and “line-of-sight” capabilities that exceed current and near-term “vehicle-resident” systems — in some cases, nearly twice the range. This longer detection distance and ability to “see” around corners or “through” other vehicles helps V2V-equipped vehicles perceive some threats sooner than sensors, cameras, or radar can, and warn their drivers accordingly.

V2V technology can also be fused with those vehicle-resident technologies to provide even greater benefits than either approach alone. V2V can augment vehicle-resident systems, extending the ability of the overall safety system to address crash scenarios not covered by V2V communications alone, such as lane and road departure. A fused system could also improve system accuracy, potentially leading to better warning timing and fewer false warnings.

Communications represent the keystone of V2V systems.  The current technology builds upon a wireless standard called Dedicated Short-Range Communication, or DSRC.  DSRC is based upon the IEEE 802.11p protocol.  Transmissions in these systems consist of highly secure, short-to-medium-range, high-speed wireless communication channels, which enable vehicles to connect with each other for short periods of time.  Using DSRC, two or more vehicles can exchange basic safety messages, which describe each vehicle’s speed, position, heading, acceleration rate, size and braking status.  The system sends these messages to the onboard units of surrounding vehicles ten (10) times per second, where they are interpreted and provide warnings to the driver.  To achieve this, V2V systems leverage telematics to track vehicles via GPS, monitoring the location, movements, behavior and status of each vehicle.  A rough sketch of such a message appears below.
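To make that concrete, here is a minimal sketch in Python of the kind of basic safety message described above. The field names and the radio interface are illustrative assumptions on my part, not the actual DSRC wire format.

  import time
  from dataclasses import dataclass, asdict

  @dataclass
  class BasicSafetyMessage:
      # Fields drawn from the description above; the names are illustrative.
      vehicle_id: str
      speed: float          # meters per second
      heading: float        # compass degrees
      latitude: float
      longitude: float
      acceleration: float   # meters per second squared
      length: float         # vehicle size, meters
      brake_applied: bool   # braking status

  def broadcast(radio_send, current_state):
      # Send the vehicle's state ten (10) times per second, as DSRC systems do.
      while True:
          message = BasicSafetyMessage(**current_state())
          radio_send(asdict(message))   # hypothetical on-board radio interface
          time.sleep(0.1)               # 10 Hz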

Based on preliminary information, NHTSA currently estimates that the V2V equipment and supporting communications functions (including a security management system) would cost approximately $341 to $350 per vehicle in 2020 dollars. It is possible that the cost could decrease to approximately $209 to $227 by 2058, as manufacturers gain experience producing this equipment (the learning curve). These costs would also include an additional $9 to $18 per year in fuel costs due to added vehicle weight from the V2V system. Estimated costs for the security management system range from $1 to $6 per vehicle, and they will increase over time due to the need to support an increasing number of vehicles with the V2V technologies. The communications costs range from $3 to $13 per vehicle. Cost estimates are not expected to change significantly with the inclusion of V2V-based safety applications, since the applications themselves are software and their costs are negligible.  Based on preliminary estimates, the total projected annual costs of the V2V system fluctuate year over year but generally show a declining trend. The estimated total annual costs range from $0.3 to $2.1 billion in 2020, with the specific costs depending upon the technology implementation scenarios and discount rates. The costs peak at $1.1 to $6.4 billion between 2022 and 2024, and then gradually decrease to $1.1 to $4.6 billion.

In terms of safety impacts, the agency estimates that just two of many possible V2V safety applications, IMA (Intersection Movement Assist) and LTA (Left Turn Assist), would potentially prevent 25,000 to 592,000 crashes annually, save 49 to 1,083 lives, avoid 11,000 to 270,000 MAIS 1-5 injuries, and reduce property-damage-only crashes by 31,000 to 728,000 by the time V2V technology had spread through the entire fleet. The agency chose those two applications for analysis at this stage because they are good illustrations of benefits that V2V can provide above and beyond the safety benefits of vehicle-resident cameras and sensors. Of course, the number of lives potentially saved would likely increase significantly with the implementation of additional V2V and V2I safety applications that would be enabled if vehicles were equipped with DSRC capability.

CONCLUSIONS: 

It is apparent to me that we are driving (pardon the pun) towards self-driving automobiles. I have no idea as to when this technology will become fully adopted, if ever.  If it is adopted, in part or across the vehicle spectrum, there will need to be some form of V2V. One car definitely needs to know where other cars are relative to position, speed, acceleration, and overall movement. My wife NEVER goes to sleep or naps while I’m driving—OK, maybe one time, as mentioned previously.  She is always remarkably attentive and aware when I’m behind the wheel.  This comes from experience gained over fifty-two years of marriage.  “The times they are a-changing.”  The great concern I have is how we are to maintain these systems and how “hackable” they may become.  As I awoke this morning, I read the following:

The credit reporting agency Equifax said Thursday that hackers gained access to sensitive personal data — Social Security numbers, birth dates and home addresses — for up to 143 million Americans, a major cybersecurity breach at a firm that serves as one of the three major clearinghouses for Americans’ credit histories.

I am sure that, like me, it gives you pause.  If hackers can do that, just think about the chaos that could occur if V2V systems were accessed and controlled.  Talk about keeping one up at night.

As always, I welcome your comments.

AN AVERAGE DAY FOR DATA

August 4, 2017


I am sure you have heard the phrase “big data” and possibly wondered just what that terminology relates to.  Let’s get the “official” definition, as follows:

The amount of data that’s being created and stored on a global level is almost inconceivable, and it just keeps growing. That means there’s even more potential to glean key insights from business information – yet only a small percentage of data is actually analyzed. What does that mean for businesses? How can they make better use of the raw information that flows into their organizations every day?

The concept gained momentum in the early 2000s when industry analyst Doug Laney articulated the now-mainstream definition of big data as the three Vs (volume, velocity and variety), to which two further dimensions, variability and complexity, are often added:

  • Volume. Organizations collect data from a variety of sources, including business transactions, social media and information from sensor or machine-to-machine data. In the past, storing it would’ve been a problem – but new technologies (such as Hadoop) have eased the burden.
  • Velocity. Data streams in at an unprecedented speed and must be dealt with in a timely manner. RFID tags, sensors and smart metering are driving the need to deal with torrents of data in near-real time.
  • Variety. Data comes in all types of formats – from structured, numeric data in traditional databases to unstructured text documents, email, video, audio, stock ticker data and financial transactions.
  • Variability. In addition to the increasing velocities and varieties of data, data flows can be highly inconsistent, with periodic peaks. Is something trending in social media? Daily, seasonal and event-triggered peak data loads can be challenging to manage, even more so with unstructured data.
  • Complexity. Today’s data comes from multiple sources, which makes it difficult to link, match, cleanse and transform data across systems. However, it’s necessary to connect and correlate relationships, hierarchies and multiple data linkages or your data can quickly spiral out of control.

AN AVERAGE DAY IN THE LIFE OF BIG DATA:

A picture is worth a thousand words, but let us now quantify, on a daily basis, what we mean by big data.

  • YouTube’s viewers are watching a billion (1,000,000,000) hours of video each day.
  • We perform over forty thousand (40,000) searches per second on Google alone. That is approximately three and one-half (3.5) billion searches per day and roughly one point two (1.2) trillion searches per year, world-wide (see the quick arithmetic check after this list).
  • Five years ago, IBM estimated that two point five (2.5) exabytes (2.5 billion gigabytes) of data were generated every day. The figure has grown since then.
  • The number of e-mails sent per day is around 269 billion, which works out to roughly ninety-eight (98) trillion e-mails per year. Globally, the data stored in data centers will quintuple by 2020 to reach 915 exabytes.  This is up 5.3-fold from 171 exabytes in 2015, a compound annual growth rate (CAGR) of roughly forty percent (40%).
  • On average, an autonomous car will churn out 4 TB of data per day when factoring in cameras, radar, sonar, GPS and LIDAR, and that assumes just one hour of driving per day.  Every autonomous car will generate the data equivalent of almost 3,000 people.
  • By 2024, mobile networks will see machine-to-machine (M2M) connections jump ten-fold to 2.3 billion from 250 million in 2014, according to Machina Research.
  • The data collected by BMW’s current fleet of 40 prototype autonomous cars during a single test session would fill a stack of CDs 60 miles high.
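As a quick sanity check on the Google search figures above, the daily and yearly numbers follow directly from the per-second rate:

  searches_per_second = 40_000
  per_day = searches_per_second * 60 * 60 * 24   # 3,456,000,000, about 3.5 billion
  per_year = per_day * 365                       # about 1.26 trillion
  print(f"{per_day:,} searches per day, {per_year:,} per year")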

We have become a world that lives “by the numbers”, and I’m not too sure that’s altogether troubling.  At no time in our history have we had access to data that informs, misinforms, directs and challenges us as we do now.  How we use that data makes all the difference in our daily lives.  I have a great friend named Joe McGuinness. His favorite expression: “It’s about time we learn to separate the fly s_____t from the pepper.”  If we apply this phrase to big data, he may just be correct. Be careful out there.


The publication EfficientGov indicates the following: “The opioid crisis is creating a workforce epidemic leading to labor shortage and workplace safety and performance challenges.”

Opioid-related deaths have reached an all-time high in the United States. More than 47,000 people died in 2014, and the numbers are rising. The Centers for Disease Control and Prevention this month released prescribing guidelines to help primary care physicians safely treat chronic pain while reducing opioid dependency and abuse. Given that the guidelines are not binding, how will the CDC and the Department of Health and Human Services make sure they make a difference? What can payers and providers do to encourage a countrywide culture shift?

The opioid epidemic is also having widespread effects on many industries relative to labor shortages, workplace safety and worker performance.  Managers and owners are trying to figure out methods to deal with drug-addicted workers and job applicants.  HR managers cite the opioid crisis as one of their biggest challenges. Applicants are unwilling or unable to pass drug tests, employees are increasingly showing signs of addiction on the job and there are workers with opioid prescriptions having significant performance problems.

Let’s take a very quick look at only three employers and what they say about the crisis.

  • Clyde McClellan used to require a drug test before people could work at his Ohio pottery company, which produces 2,500 hand-cast coffee mugs a day for Starbucks and others. Now, he skips the tests and finds it more efficient to flat-out ask applicants: “What are you on?”
  • At Homer Laughlin China, a company that makes a colorful line of dishware known as Fiesta and employs 850 at a sprawling complex in Newell, W.V., up to half of applicants either fail or refuse to take mandatory pre-employment drug screens, said company president Liz McIlvain. “The drugs are so cheap and they’re so easily accessible,” McIlvain, a fourth-generation owner of the company, said. “We have a horrible problem here.”
  • “That is really the battlefield for us right now,” said Markus Dietrich, global manager of employee assistance and work-life services at chemical giant DuPont, which employs 46,000 worldwide.

As you might suspect, the epidemic is having a devastating effect on companies — large and small — and their ability to stay competitive. Managers and owners across the country are at a loss as to how to deal with addicted workers and potential workers, calling the issue one of the biggest problems they face. Applicants are increasingly unwilling or unable to pass drug tests; then there are those who pass only to show signs of addiction once employed. Even more confounding: how to respond to employees who have a legitimate prescription for opioids but whose performance slips.  Some individuals genuinely need painkillers, and denying them would be difficult; but how do you deal with this if you are a manager who fears safety issues and potential lawsuits when there is overuse?

The issue is amplifying labor shortages in industries like trucking, which has had difficulty for the last six (6) years finding qualified workers and drivers.  It is also pushing employers to broaden their job searches, recruiting people from greater distances when roles can’t be filled with local workers. At stake is not only safety and productivity within companies — but the need for humans altogether, with some manufacturers claiming opioids force them to automate work faster.

One corporate manager said: “You’re going to see manufacturing jobs slowly going away for, if nothing else, that reason alone.  It’s getting worse, not better.”

Economists have noticed also. In Congressional testimony earlier this month, Federal Reserve chair Janet Yellen related opioid use to a decline in the labor participation rate. The past three Fed surveys on the economy, known as the Beige Book, explicitly mentioned employers’ struggles to find applicants who can pass drug tests as a barrier to hiring. The surveys, snapshots of economic conditions in the Fed’s twelve (12) districts, don’t mention the type of drugs used.  At a Congressional hearing in June of this year focused on opioids and their economic consequences, Ohio attorney general Mike DeWine estimated that forty (40) percent of applicants in the state either failed or refused a drug test. This prevents people from operating machinery, driving a truck or getting a job managing a McDonald’s, he said.

OK, what should a manufacturer do to lessen or, hopefully, eliminate the problem?  Several suggestions have been put forth, as follows:

Policy Option 1: Medical Education– Opioid education is crucial at all levels, from medical school and residency, through continuing education; and must involve primary care, specialists, mental health providers, pharmacies, emergency departments, clinics and patients. The push to increase opioid education must come from medical schools, academic medical centers, accrediting organizations and possibly state legislatures.

Policy Option 2: Continuing Medical Education– Emphasize the importance of continuing medical education (CME) for practicing physicians. CME can be strengthened by incorporating the new CDC guidelines, and physicians should learn when and how to safely prescribe these drugs and how to handle patients with drug-seeking behavior.

Policy Option 3: Public Education– Emphasize the need to address patient demand, not just physician supply, for opioids. It compared the necessary education to the campaign to reduce demand for antibiotics. The public needs to learn about the harms as well as the benefits of these powerful painkillers, and patients must understand that their pain can be treated with less-dangerous medications, or nonpharmacological interventions like physical therapy or acupuncture. Such education could be spearheaded by various physician associations and advocacy groups, with support from government agencies and officials at HHS and elsewhere.

Policy Option 4: Removing Perverse Incentives and Payment Barriers– Prescribing decisions are influenced by patient satisfaction surveys and insurance reimbursement practices, participants said. Patient satisfaction surveys are perceived — not necessarily accurately — as making it harder for physicians to say “no” to patients who are seeking opioids. Long-standing insurance practices, such as allowing only one pain prescription to be filled a month, are also encouraging doctors to prescribe more pills than a patient is likely to need — adding to the risk of overuse, as well as chance of theft, sale or other diversion of leftover drugs.

Policy Option 5: Solutions through Technology– Prescription Drug Monitoring Programs (PDMP) and Electronic Health Records (EHR) could be important tools in preventing opioid addiction, but several barriers stand in the way. The PDMP data are incomplete; for instance, a physician in Washington, D.C., can’t see whether a patient is also obtaining drugs in Maryland or Virginia. The records are not user friendly; and they need to be integrated into EHRs so doctors can access them both — without additional costs piled on by the vendors. It could be helpful if certain guidelines, like defaults for dosing and prescribing, were baked into the electronic records.

Policy Option 6: Access to addiction treatment and reducing stigma—There is a need to change how the country thinks about — and talks about — addiction and mental illness. Substance abuse treatment suffers when people with addiction are treated as criminals or deviants. Instead, substance abuse disorder should be treated as an illness, participants recommended. High deductibles in health plans, including Obamacare exchange plans, create another barrier to substance abuse treatment.

CONCLUSIONS:  I don’t really know how we got here, but we are a country with a very, very “deep bench”.  We know how to do things, so let’s put all of our resources together to solve this very troublesome problem.


Portions of the following post were taken from an article by Rob Spiegel published through Design News Daily.

Two former Apple design engineers, Anna Katrina Shedletsky and Samuel Weiss, have leveraged machine learning to help brand owners improve their manufacturing lines. Their company, Instrumental, uses artificial intelligence (AI) to identify and fix problems with the goal of helping clients ship on time. The AI system consists of camera-equipped inspection stations that allow brand owners to remotely manage product lines at their contract manufacturing facilities with the purpose of maximizing up-time, quality and speed. Their photo is shown below:

Shedletsky and Weiss took what they learned from years of working with Apple contract manufacturers and put it into AI software.

“The experience with Apple opened our eyes to what was possible. We wanted to build artificial intelligence for manufacturing. The technology had been proven in other industries and could be applied to the manufacturing industry.  It’s part of the evolution of what is happening in manufacturing. The product we offer today solves a very specific need, but it also works toward overall intelligence in manufacturing.”

Shedletsky spent six (6) years working at Apple prior to founding Instrumental with fellow Apple alum Weiss, who serves as Instrumental’s CTO (Chief Technical Officer).  The two took their experience in solving manufacturing problems and created the AI fix. “After spending hundreds of days at manufacturers responsible for millions of Apple products, we gained a deep understanding of the inefficiencies in the new-product development process,” said Shedletsky. “There’s no going back, robotics and automation have already changed manufacturing. Intelligence like the kind we are building will change it again. We can radically improve how companies make products.”

There are numerous examples of big and small companies with problems that prevent them from shipping products on time. Delays are expensive and can cause the loss of a sale. One day of delay at a start-up could cost $10,000 in sales. For a large company, the cost could be millions. “There are hundreds of issues that need to be found and solved. They are difficult and they have to be solved one at a time,” said Shedletsky. “You can get on a plane, go to a factory and look at failure analysis so you can see why you have problems. Or, you can reduce the amount of time needed to identify and fix the problems by analyzing them remotely, using a combo of hardware and software.”

Instrumental combines hardware and software that take images of each unit at key stages of assembly on the line. The system then makes those images remotely searchable and comparable in order for the brand owner to learn from and react to assembly line data. Engineers can then take action on issues. “The station goes onto the assembly line in China,” said Shedletsky. “We get the data into the cloud to discover issues the contract manufacturer doesn’t know they have. With the data, you can do failure analysis and reduce the time it takes to find an issue and correct it.”
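Instrumental has not published its algorithms, but the basic idea of flagging a unit whose assembly image deviates from a known-good reference can be sketched in a few lines of Python. The mean-difference test and the threshold below are stand-in assumptions of mine, not the company’s method.

  import numpy as np

  def flag_unit(unit_image, golden_image, threshold=0.05):
      # Both images are grayscale NumPy arrays of identical shape,
      # with pixel values scaled to the range [0, 1].
      diff = np.abs(unit_image.astype(float) - golden_image.astype(float))
      # Flag the unit for engineering review if it deviates too far
      # from the known-good reference image.
      return diff.mean() > threshold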

WHAT IS AI:

Artificial intelligence (AI) is intelligence exhibited by machines.  In computer science, the field of AI research defines itself as the study of “intelligent agents“: any device that perceives its environment and takes actions that maximize its chance of success at some goal.   Colloquially, the term “artificial intelligence” is applied when a machine mimics “cognitive” functions that humans associate with other human minds, such as “learning” and “problem solving”.
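A toy version of that “intelligent agent” definition fits in a few lines of Python. The environment, the available actions and the scoring function here are entirely invented for illustration.

  def run_agent(perceive, actions, score, steps=100):
      # An agent: perceive the environment, then take the action
      # that maximizes the chance of success at some goal.
      for _ in range(steps):
          state = perceive()                                 # sense the environment
          best_action = max(actions, key=lambda a: score(state, a))
          best_action(state)                                 # act on the world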

As machines become increasingly capable, mental faculties once thought to require intelligence are removed from the definition. For instance, optical character recognition is no longer perceived as an example of “artificial intelligence”, having become a routine technology.  Capabilities currently classified as AI include successfully understanding human speech, competing at a high level in strategic game systems (such as chess and Go), autonomous cars, intelligent routing in content delivery networks, military simulations, and interpreting complex data.

FUTURE:

Some would have you believe that AI IS the future and that we will succumb to the “Rise of the Machines”.  I’m not so melodramatic.  I feel AI has progressed, and will continue to progress, to the point where great savings in time and labor may be realized.  Anna Katrina Shedletsky and Samuel Weiss realize the potential and feel there will be no going back from this disruptive technology.  Moving AI to the factory floor will produce great benefits for manufacturing and other commercial enterprises.  There is also a significant possibility that job creation will occur as a result.  All is not doom and gloom.


Various definitions of product lifecycle management, or PLM, have been issued over the years, but basically product lifecycle management is the process of managing the entire lifecycle of a product from inception, through engineering design and manufacture, to service and disposal of manufactured products.  PLM integrates people, data, processes and business systems and provides a product information backbone for companies and their extended enterprise.

In recent years, great emphasis has been put on disposal of a product after its service life has been met.  How to get rid of a product or component is extremely important. Disposal methodology is covered by RoHS standards for the European Community.  If you sell into the EU, you will have to designate proper disposal.  Dumping in a landfill is no longer appropriate.

Since this post deals with the application of PLM to industry, we will now look at various industry definitions.

Industry Definitions

“PLM is a strategic business approach that applies a consistent set of business solutions in support of the collaborative creation, management, dissemination, and use of product definition information across the extended enterprise, spanning from product concept to end of life and integrating people, processes, business systems, and information. PLM forms the product information backbone for a company and its extended enterprise.”  Source: CIMdata

“Product life cycle management or PLM is an all-encompassing approach for innovation, new product development and introduction (NPDI) and product information management from initial idea to the end of life.  PLM Systems is an enabling technology for PLM integrating people, data, processes, and business systems and providing a product information backbone for companies and their extended enterprise.” Source:  PLM Technology Guide

“The core of PLM (product life cycle management) is in the creation and central management of all product data and the technology used to access this information and knowledge. PLM as a discipline emerged from tools such as CAD, CAM and PDM, but can be viewed as the integration of these tools with methods, people and the processes through all stages of a product’s life.” Source:  Wikipedia article on Product Lifecycle Management

“Product life cycle management is the process of managing product-related design, production and maintenance information. PLM may also serve as the central repository for secondary information, such as vendor application notes, catalogs, customer feedback, marketing plans, archived project schedules, and other information acquired over the product’s life.” Source:  Product Lifecycle Management

“It is important to note that PLM is not a definition of a piece, or pieces, of technology. It is a definition of a business approach to solving the problem of managing the complete set of product definition information-creating that information, managing it through its life, and disseminating and using it throughout the lifecycle of the product. PLM is not just a technology, but is an approach in which processes are as important, or more important than data.” Source:  CIMdata

“PLM or Product Life Cycle Management is a process or system used to manage the data and design process associated with the life of a product from its conception and envisioning through its manufacture, to its retirement and disposal. PLM manages data, people, business processes, manufacturing processes, and anything else pertaining to a product. A PLM system acts as a central information hub for everyone associated with a given product, so a well-managed PLM system can streamline product development and facilitate easier communication among those working on/with a product.”  Source: Aras

A pictorial representation of PLM may be seen as follows:

Hopefully, you can see that PLM deals with methodologies from “white napkin design to landfill disposal”.  Please note that documentation is critical to all aspects of PLM; good document production, storage and retrieval are extremely important to the overall process.  We are talking about CAD, CAM, CAE, DFSS, laboratory testing notes, etc.  In other words, “the whole nine yards” of product life.  If you work in a company with ISO certification, PLM is a great method to ensure you retain that certification.

In looking at the four stages of a product’s lifecycle, we see the following:

Four Stages of Product Life Cycle—Marketing and Sales:

The first stage is introduction, when the product is brought into the market. In this stage, there’s heavy marketing activity and product promotion, and the product is put into limited outlets in a few channels for distribution. Sales take off slowly in this stage. The need is to create awareness, not profits.

The second stage is growth. In this stage, sales take off, the market knows of the product; other companies are attracted, profits begin to come in and market shares stabilize.

The third stage is maturity, where sales grow at slowing rates and finally stabilize. In this stage, products get differentiated, price wars and sales promotion become common and a few weaker players exit.

The fourth stage is decline. Here, sales drop as consumer tastes change and the product is no longer relevant or useful. Price wars continue, several products are withdrawn and cost control becomes the way out for most products in this stage.
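One way to picture how a PLM system ties these four marketing stages into the broader cradle-to-grave lifecycle described earlier is to model the states a central product record can occupy. This is a hypothetical sketch in Python, not any particular vendor’s data model.

  from dataclasses import dataclass, field
  from enum import Enum, auto

  class LifecycleStage(Enum):
      # "White napkin design to landfill disposal", with the four
      # marketing stages embedded in the middle of the life cycle.
      CONCEPT = auto()
      DESIGN = auto()
      MANUFACTURE = auto()
      INTRODUCTION = auto()
      GROWTH = auto()
      MATURITY = auto()
      DECLINE = auto()
      SERVICE = auto()
      DISPOSAL = auto()     # proper disposal, e.g. per RoHS in the EU

  @dataclass
  class ProductRecord:
      # The central information hub for everyone associated with the product.
      name: str
      stage: LifecycleStage = LifecycleStage.CONCEPT
      documents: list = field(default_factory=list)   # CAD, CAM, CAE, test notes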

Benefits of PLM Relative to the Four Stages of Product Life:

Considering the benefits of Product Lifecycle Management, we realize the following:

  • Reduced time to market
  • Increased full-price sales
  • Improved product quality and reliability
  • Reduced prototyping costs
  • More accurate and timely request for quote generation
  • Ability to quickly identify potential sales opportunities and revenue contributions
  • Savings through the re-use of original data
  • A framework for product optimization
  • Reduced waste
  • Savings through the complete integration of engineering workflows
  • Documentation that can assist in proving compliance with RoHS or Title 21 CFR Part 11
  • Ability to provide contract manufacturers with access to a centralized product record
  • Seasonal fluctuation management
  • Improved forecasting to reduce material costs
  • Maximized supply chain collaboration
  • Much better “troubleshooting” when field problems arise, enabled by laboratory testing and reliability testing documentation.

PLM considers not only the four stages of a product’s lifecycle but all of the work prior to marketing and sales AND disposal after the product is removed from commercialization.  With this in mind, why is PLM a necessary business technique today?  Because of increases in technology, manpower and specialization of departments, PLM was needed to integrate all activity toward the design, manufacture and support of the product. Back in the late 1960s, when the F-15 Eagle was conceived and developed, almost all manufacturing and design processes were done by hand.  Blueprints or drawings needed to make the parts for the F-15 were created on paper. No electronics, no e-mails; all paper for documents. This caused a lack of efficiency in design and manufacturing compared to today’s technology.  OK, another example of today’s technology and the application of PLM.

If we look at the processes for Boeing’s DREAMLINER, we see the 787 Dreamliner has about 2.3 million parts per airplane.  Development and production of the 787 have involved large-scale collaboration with numerous suppliers worldwide. The parts include everything from “fasten seatbelt” signs to jet engines and vary in size from small fasteners to large fuselage sections. Some parts are built by Boeing, and others are purchased from supplier partners around the world.  In 2012, Boeing purchased approximately seventy-five (75) percent of its supplier content from U.S. companies. On the 787 program, content from non-U.S. suppliers accounts for about thirty (30) percent of purchased parts and assemblies.  PLM, or Boeing’s version of PLM, was used to bring about commercialization of the 787 Dreamliner.

 
