AUGMENTED REALITY (AR)

October 13, 2017


Ask just about anybody to give a definition of Virtual Reality (VR) and they will take a stab at it. This is because the gaming and entertainment segments of our population have used VR as a new tool to promote games such as SuperHot VR, Rock Band VR, House of the Dying Sun, Minecraft VR, Robo Recall, and others.  Ask them about Augmented Reality (AR), however, and they will probably give you the definition of VR, or nothing at all.

Augmented Reality, sometimes called Mixed Reality, is a technology that merges real-world objects or the environment with virtual elements generated from sensory input such as sound, video, graphics, or GPS data.  Unlike VR, which completely replaces the real world with a virtual one, AR operates in real time, interacting with objects found in the environment and overlaying a virtual display on the real one.

While popularized by gaming, AR technology has shown a real knack for bringing an interactive digital world into a person’s perceived real world, where the digital layer can reveal more information about a real-world object in view.  That is basically what AR strives to do.  Let’s take a look at several very real applications of AR that indicate the possibilities of this technology.

  • Augmented Reality has found a home in healthcare, aiding preventative measures by giving professionals information on the status of patients. Healthcare giant Cigna recently launched a program called BioBall that uses Microsoft HoloLens technology in an interactive game to measure blood pressure and body mass index (BMI). Patients hold a lightweight, medium-sized ball in their hands in a one-minute race to capture all the images that flash on the screen in front of them; the BioBall senses a player’s heartbeat. At the University of Maryland’s Augmentarium virtual and augmented reality laboratory, the school is using AR in healthcare to improve how ultrasound is administered to a patient.  Physicians wearing an AR device can look at both a patient and the ultrasound machine while images flash on the “hood” of the AR device itself.
  • AR is opening up new methods to teach young children a variety of subjects they might not be interested in learning or, in some cases, to help those who have trouble in class catch up with their peers. The University of Helsinki’s AR program helps struggling kids learn science by letting them virtually interact with molecular movement in gases, gravity, sound waves, and airplane wind physics.  AR creates new types of learning possibilities by transporting “old knowledge” into a new format.
  • Projection-based AR is emerging as a new way to cast virtual elements into the real world without the use of bulky headgear or glasses, which is why AR is becoming a very popular alternative for use in the office or during meetings. Startups such as Lampix and Lightform are working on projection-based augmented reality for use in the boardroom, retail displays, hospitality rooms, digital signage, and other applications.
  • In Germany, a company called FleetBoard is developing application software that tracks logistics for truck drivers to help with the long series of pre-departure checks before setting off cross-country or for local deliveries. The FleetBoard Vehicle Lens app uses a smartphone and software to provide live image recognition that identifies the truck’s number plate.  The relevant information is then superimposed in AR, speeding up the pre-departure process.
  • Last winter, Delft University of Technology in the Netherlands started working with first responders on using AR as a tool in crime scene investigation. The handheld AR system allows on-scene investigators and remote forensic teams to minimize the potential for site contamination.  This could be extremely helpful in finding traces of DNA, preserving evidence, and getting medical help from an outside source.
  • Sandia National Laboratories is working with AR as a tool to improve security training for users who are protecting vulnerable areas such as nuclear weapons or nuclear materials. The physical security training helps guide users through real-world examples such as theft or sabotage in order to be better prepared when an event takes place.  The training can be accomplished remotely and cheaply using standalone AR headsets.
  • In Finland, the VTT Technical Research Center recently developed an AR tool for the European Space Agency (ESA) for astronauts to perform real-time equipment monitoring in space. AR prepares astronauts with in-depth practice by coordinating the activities with experts in a mixed-reality situation.
  • U.S.-based Daqri International uses computer vision for industrial AR to enable data visualization while working on machinery or in a warehouse. Daqri’s glasses and headsets display project data, tasks that need to be completed, and potential problems with machinery, or even where an object needs to be placed or repaired.

CONCLUSIONS:

Augmented Reality merges real-world objects with virtual elements generated by sensory input devices to provide great advantages to the user.  No longer are gaming and entertainment the sole objectives of its use.  This brings to life a “new normal” for professionals seeking more and better technology to provide solutions to real-world problems.


AMAZING GRACE

October 3, 2017


There are many people responsible for the revolutionary development and commercialization of the modern-day computer.  Just a few of those names are given below, many of whom you have probably never heard of.  Let’s take a look.

COMPUTER REVOLUTIONARIES:

  • Howard Aiken–Aiken was the original conceptual designer behind the Harvard Mark I computer in 1944.
  • Grace Murray Hopper–Hopper coined the term “debugging” in 1947 after removing an actual moth from a computer. Her ideas about machine-independent programming led to the development of COBOL, one of the first modern programming languages. On top of it all, the Navy destroyer USS Hopper is named after her.
  • Ken Thompson and Dennis Ritchie–These guys invented Unix in 1969, the importance of which CANNOT be overstated. Consider this: your fancy Apple computer relies almost entirely on their work.
  • Doug and Gary Carlston–This team of brothers co-founded Brøderbund Software, a successful gaming company that operated from 1980-1999. In that time, they were responsible for churning out or marketing revolutionary computer games like Myst and Prince of Persia, helping bring computing into the mainstream.
  • Ken and Roberta Williams–This husband and wife team founded On-Line Systems in 1979, which later became Sierra Online. The company was a leader in producing graphical adventure games throughout the advent of personal computing.
  • Seymour Cray–Cray was a supercomputer architect whose computers were the fastest in the world for many decades. He set the standard for modern supercomputing.
  • Marvin Minsky–Minsky was a professor at MIT and oversaw the AI Lab, a hotspot of hacker activity, where he let prominent programmers like Richard Stallman run free. Were it not for his open-mindedness, programming skill, and ability to recognize that important things were taking place, the AI Lab wouldn’t be remembered as the talent incubator that it is.
  • Bob Albrecht–He founded the People’s Computer Company and developed a sincere passion for encouraging children to get involved with computing. He’s responsible for ushering in innumerable new young programmers and is one of the first modern technology evangelists.
  • Steve Dompier–At a time when computer speech was just barely being realized, Dompier made his computer sing. It was a trick he unveiled at the first meeting of the Homebrew Computer Club in 1975.
  • John McCarthy–McCarthy invented Lisp, the second-oldest high-level programming language that’s still in use to this day. He’s also responsible for bringing mathematical logic into the world of artificial intelligence — letting computers “think” by way of math.
  • Doug Engelbart–Engelbart is most noted for inventing the computer mouse in the mid-1960s, but he’s made numerous other contributions to the computing world. He created early GUIs and was even a member of the team that developed the now-ubiquitous hypertext.
  • Ivan Sutherland–Sutherland received the prestigious Turing Award in 1988 for inventing Sketchpad, the predecessor to the type of graphical user interfaces we use every day on our own computers.
  • Tim Paterson–He wrote QDOS, an operating system that he sold to Bill Gates in 1980. Gates rebranded it as MS-DOS, selling it to the point that it became the most widely used operating system of the day. (How ‘bout them apples?)
  • Dan Bricklin–He’s “The Father of the Spreadsheet.” Working in 1979 with Bob Frankston, he created VisiCalc, a predecessor to Microsoft Excel. It was the killer app of the time — people were buying computers just to run VisiCalc.
  • Bob Kahn and Vint Cerf–Prolific internet pioneers, these two teamed up to build the Transmission Control Protocol and the Internet Protocol, better known as TCP/IP. These are the fundamental communication technologies at the heart of the Internet.
  • Niklaus Wirth–Wirth designed several programming languages, but is best known for creating Pascal. He won a Turing Award in 1984 for “developing a sequence of innovative computer languages.”

ADMIRAL GRACE MURRAY HOPPER:

At this point, I want to highlight Admiral Grace Murray Hopper, or “Amazing Grace” as she is called in the computer world and the United States Navy.

Born in New York City in 1906, Grace Hopper joined the U.S. Navy during World War II and was assigned to program the Mark I computer. She continued to work in computing after the war, leading the team that created the first computer language compiler, which led to the popular COBOL language. She resumed active naval service at the age of 60, becoming a rear admiral before retiring in 1986. Hopper died in Virginia in 1992.

Born Grace Brewster Murray in New York City on December 9, 1906, Grace Hopper studied math and physics at Vassar College. After graduating from Vassar in 1928, she proceeded to Yale University, where, in 1930, she received a master’s degree in mathematics. That same year, she married Vincent Foster Hopper, becoming Grace Hopper (a name that she kept even after the couple’s 1945 divorce). Starting in 1931, Hopper began teaching at Vassar while also continuing to study at Yale, where she earned a Ph.D. in mathematics in 1934—becoming one of the first few women to earn such a degree.

After the war, Hopper remained with the Navy as a reserve officer. As a research fellow at Harvard, she worked with the Mark II and Mark III computers. She was at Harvard when a moth was found to have shorted out the Mark II, and is sometimes given credit for the invention of the term “computer bug”—though she didn’t actually author the term, she did help popularize it.

Hopper retired from the Naval Reserve in 1966, but her pioneering computer work meant that she was recalled to active duty—at the age of 60—to tackle standardizing communication between different computer languages. She would remain with the Navy for 19 years. When she retired in 1986, at age 79, she was a rear admiral as well as the oldest serving officer in the service.

Saying that she would be “bored stiff” if she stopped working entirely, Hopper took another job post-retirement and stayed in the computer industry for several more years. She was awarded the National Medal of Technology in 1991, becoming the first female individual recipient of the honor. At the age of 85, she died in Arlington, Virginia, on January 1, 1992. She was laid to rest in Arlington National Cemetery.

CONCLUSIONS:

In 1997, the guided missile destroyer USS Hopper was commissioned by the Navy in San Francisco. In 2004, the University of Missouri honored Hopper with a computer museum on its campus, dubbed “Grace’s Place.” On display are early computers and computer components to educate visitors on the evolution of the technology. In addition to her programming accomplishments, Hopper’s legacy includes encouraging young people to learn how to program. The Grace Hopper Celebration of Women in Computing Conference is a technical conference that encourages women to become part of the world of computing, while the Association for Computing Machinery offers a Grace Murray Hopper Award. Additionally, on her birthday in 2013, Hopper was remembered with a “Google Doodle.”

In 2016, Hopper was posthumously honored with the Presidential Medal of Freedom by Barack Obama.

Who said women could not “do” STEM (Science, Technology, Engineering and Mathematics)?

HACKED OFF

October 2, 2017


Portions of this post are taken from an article by Rob Spiegel of Design News Daily.

You can now anonymously hire a cybercriminal online for as little as six to ten dollars ($6 to $10) per hour, says Rodney Joffe, senior vice president at Neustar, a cybersecurity company. As it becomes easier to engineer such attacks, with costs falling, more businesses are getting targeted. About thirty-two (32) percent of information technology professionals surveyed said DDoS attacks cost their companies $100,000 an hour or more. That percentage is up from thirty (30) percent reported in 2014, according to Neustar’s survey of over 500 high-level IT professionals. The data was released Monday.

Hackers are costing consumers and companies between $375 and $575 billion annually, according to a study published this past Monday, a number only expected to grow as online information stealing expands with increased Internet use.  This number blows my mind.   I actually had no idea the costs were so great.  Great and increasing.

Online crime is estimated at 0.8 percent of worldwide GDP, with developed countries in regions including North America and Europe losing more than countries in Latin America or Africa, according to the new study published by the Center for Strategic and International Studies and funded by cybersecurity firm McAfee.

That amount rivals the share of worldwide GDP – 0.9 percent – that is spent on managing the narcotics trade. The difference in costs between developed and developing nations may be due to better accounting or transparency in developed nations, as the cost of online crime can be difficult to measure and some companies do not disclose when they are hacked for fear of damage to their reputations, the report said.

Cyber attacks have changed in recent years. Gone are the days when relatively benign bedroom hackers entered organizations to show off their skills; it is no longer a guy in the basement of his mom’s home eating Doritos. Attackers now are often sophisticated criminals who target employees who have access to the organization’s jewels. Instead of using blunt force, these savvy criminals use age-old human fallibility to con unwitting employees into handing over the keys to the vault.

Professional criminals like the crime opportunities they’ve found on the internet; it’s far less dangerous than slinging guns. Criminal gangs have discovered they can carry out crime more effectively over the internet, with less chance of getting caught, and cybersecurity is getting worse as a result.

Hacking individual employees is often the easiest way into a company. One of the cheapest and most effective ways to target an organization is to target its people. Attackers use psychological tricks that have been exploited throughout human history, and the internet lets those con tricks be carried out on a large scale. The criminals do reconnaissance over email to find out about targets, then take advantage of key human traits.

One common attack comes as an email impersonating a CEO or supplier. The email looks like it came from your boss or a regular supplier, but it is actually targeted at a specific professional in the organization. The email might say, “We’ve acquired a new organization. We need to pay them. We need the company’s bank details, and we need to keep this quiet so it won’t affect our stock price.” It will go on to say, “We only trust you, and you need to do this immediately.” The email comes from a criminal, using triggers like flattery and authority (“You’re the most trusted individual in the organization”) and creating the panic of time pressure. Believe it or not, my consulting company has gotten these messages, the most recent being one purporting to come from Experian.
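To make the pattern concrete, here is a minimal Python sketch of how those triggers (urgency, secrecy, flattery, authority) could be scored in an incoming message. The keyword lists and the example email are my own illustrative assumptions, not anyone’s real filter:

```python
# A toy scorer for the social-engineering triggers described above:
# urgency, secrecy, flattery, and authority. The keyword lists are
# illustrative assumptions, not a real filter's rule set.

URGENCY = ("immediately", "urgent", "right away", "asap")
SECRECY = ("keep this quiet", "confidential", "tell no one")
FLATTERY = ("most trusted", "we only trust you", "only you")
AUTHORITY = ("ceo", "chief executive", "your boss")

def social_engineering_score(text: str) -> int:
    """Count how many trigger categories appear in an email body."""
    body = text.lower()
    categories = (URGENCY, SECRECY, FLATTERY, AUTHORITY)
    return sum(any(phrase in body for phrase in cat) for cat in categories)

email = ("We've acquired a new organization and need its bank details. "
         "Keep this quiet. We only trust you - do this immediately.")
print(social_engineering_score(email))  # 3 of the 4 categories fire
```

Real mail-security tools lean on far richer signals, sender reputation and domain checks among them, but the underlying idea of looking for the con artist’s triggers is the same.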

Even long-term attacks can be launched using this tactic of a CEO message. A company in Malaysia received kits purporting to come from the CEO; users were told the kits needed to be installed. It took months before the company found out they didn’t come from the CEO at all.

Instead of increased technology, some of the new hackers are deploying classic con moves, playing against personal foibles. They are taking advantage of those base aspects of human nature and how we’re taught to behave. We have to make sure we have better awareness; for cybersecurity training to be engaging, it has to have an impact.

As well as entering the email stream, hackers are identifying the personal interests of victims on social media. Every kind of media is used for attacks. Social media is used to carry out reconnaissance, to identify targets, and to learn about them. Users need to see what attackers can find out about them on Twitter or Facebook. The trick hackers use is to pretend they know the target; then they get close through personal interaction on social media. You can look at an organization on Twitter and see who works in finance. Attackers then take a good look across social platforms to find those individuals and see if they go to a class each week or if they traveled to Iceland in 1996. From there, you can put together a spear-phishing message that says, “Hey, I went on this trip with you.”

CONCLUSIONS:

The counter-action to personal hacking is education and awareness. The company can identify potential weaknesses and potential targets and then change the vulnerable aspects of the corporate environment.  We have to look at the culture of the organization. Those who are under pressure are targets. They don’t have time to study each email they get. We also have to discourage reliance on email.   Hackers also exploit the culture of fear, where people are punished for their mistakes. Those are the people most in danger. We need to create a culture where if someone makes a mistake, they can immediately come forward. The quicker someone comes forward, the quicker we can deal with it.


In preparation for this post, I asked my fifteen-year-old grandson to define product logistics and product supply chain.  He looked at me as though I had just fallen off a turnip truck.  I said, “You know, how does a manufacturer or producer of products get those products to the customer, the eventual user of the device or commodity?  How does that happen?”  His reply: “I really need to go do my homework.  Can I think about this and give you an answer tomorrow?”

SUPPLY CHAIN LOGISTICS:

Let’s take a look at Logistics and Supply Chain Management:

“Logistics typically refers to activities that occur within the boundaries of a single organization and Supply Chain refers to networks of companies that work together and coordinate their actions to deliver a product to market. Also, traditional logistics focuses its attention on activities such as procurement, distribution, maintenance, and inventory management. Supply Chain Management (SCM) acknowledges all of traditional logistics and also includes activities such as marketing, new product development, finance, and customer service” – from Essentials of Supply Chain Management by Michael Hugos.

“Logistics is about getting the right product, to the right customer, in the right quantity, in the right condition, at the right place, at the right time, and at the right cost (the seven Rs of Logistics)” – from Supply Chain Management: A Logistics Perspective by John J. Coyle et al.

Now, that wasn’t so difficult, was it?

MOBILITY AND THE SUPPLY CHAIN:

There have been remarkable advancements in supply chain logistics over the past decade.  Most of those advancements have resulted from companies bringing digital technologies into the front office, the warehouse, and transportation to the eventual customer.  Mobile technologies are certainly changing how products are tracked outside the four walls of the warehouse and the distribution center.  Real-time logistics management is within the grasp of many very savvy shippers.  To be clear:

Mobile networking refers to technology that supports voice and/or data network connectivity wirelessly, via radio transmission.  The most familiar applications of mobile networking are the mobile phone, the tablet, and the iPad.  From real-time goods tracking to routing assistance to the Internet of Things (IoT), “cutting wires” in the area between the warehouse and the customer’s front door is gaining ground as shippers grapple with fast order fulfillment, smaller order sizes, and ever-evolving customer expectations.
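To make that concrete, here is a minimal Python sketch of the kind of real-time tracking message a driver’s device might send over the mobile network to a logistics backend. The field names and status values are assumptions made up for illustration:

```python
import json
import time
import uuid

def tracking_event(shipment_id: str, lat: float, lon: float, status: str) -> str:
    """Build one real-time tracking message of the kind a driver's
    device could send over the mobile network to a logistics backend."""
    return json.dumps({
        "event_id": str(uuid.uuid4()),
        "shipment_id": shipment_id,
        "timestamp": time.time(),              # epoch seconds
        "position": {"lat": lat, "lon": lon},  # from the device's GPS
        "status": status,                      # e.g. "in_transit", "delivered"
    })

print(tracking_event("SHP-1042", 38.9072, -77.0369, "in_transit"))
```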

In return for their tech investments, shippers and logistics managers are gaining benefits such as shortened lead times, improved supply chain visibility, error reductions, optimized transportation networks, and better inventory management.  Combine these advantages and you see that wireless communications are helping companies work smarter and more efficiently in today’s very fast-paced business world.

MOBILITY TRENDS:

Let’s look now at six (6) mobility trends.

  1. Increasingly Sophisticated Vehicle Communications—There was a time when the only contact a driver had with home base was after an action, such as a load drop-off, took place, or when there was an en-route problem. Today, as you might expect, truck drivers, pilots, and others responsible for getting product to the customer can communicate in real time.  Cell phones have revolutionized communication and made real-time contact possible.
  2. Trucking Apps—By 2015, Frost & Sullivan indicated the size of the mobile trucking app market had hit $35.4 billion. Mobile apps targeting logistics are being launched almost constantly. With the launch of UBER Freight, the competition in the trucking app space has heated up considerably, pressing incumbents to innovate and move much faster than ever before.
  3. It’s Not Just for the Big Guys Anymore: At one time, fleet mobility solutions were reserved for larger companies that could afford them.  As technology has advanced and become more mainstream and affordable, so have fleet mobility solutions.
  4. Mobility Helps Pinpoint Performance and Productivity Gaps: Knowing where everything is at any one given time is “golden”. It is the Holy Grail for every logistics manager.  Mobility is putting that goal within their reach.
  5. More Data Means More Mobile Technology to Generate and Support Logistics: One great problem now being solved is how to handle perishable goods and refrigerated consumer items.  Shippers who handle these commodities are now using sensors to detect trailer temperatures, dead batteries, and other problems that would impact their cargoes.  Using sensors, and the data they generate, shippers can make much better business decisions and head off problems before they occur.  Sensors, if monitored properly, can indicate trends and predict eventual problems; a small code sketch after this list illustrates the idea.
  6. Customers Want More Information and Data—They Want It Now: Customers now expect real-time shipment data at their fingertips, without having to pick up a telephone or send an e-mail.  Right now, that information is available quickly online or with a smartphone.
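Here is the small sketch promised in item 5: a rolling-average check on a refrigerated trailer’s temperature sensor, the sort of trend watching that can flag a failing compressor before the cargo is lost. The safe range, window size, and readings are illustrative assumptions:

```python
from statistics import mean

SAFE_RANGE_C = (-20.0, -15.0)   # assumed range for frozen cargo

def check_trailer(readings_c, window=5):
    """Alert when the rolling average of recent sensor readings
    drifts outside the safe range -- catching the trend before
    the cargo is lost."""
    avg = mean(readings_c[-window:])
    low, high = SAFE_RANGE_C
    if not low <= avg <= high:
        return f"ALERT: rolling avg {avg:.1f} C outside {SAFE_RANGE_C}"
    return f"OK: rolling avg {avg:.1f} C"

# Readings trending upward -- perhaps a failing compressor.
print(check_trailer([-18.0, -16.5, -15.0, -14.2, -13.5, -12.9]))
```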

CONCLUSIONS: 

The world is changing at light speed, and mobile communications is one technology making this possible.  I have no idea where we will be in ten years, but it just might be exciting.


WHERE WE ARE:

The manufacturing industry remains an essential component of the U.S. economy.  In 2016, manufacturing accounted for almost twelve percent (11.7%) of the U.S. gross domestic product (GDP) and contributed slightly over two trillion dollars ($2.18 trillion) to our economy. Every dollar spent in manufacturing adds close to two dollars ($1.81) to the economy, because it contributes to development in auxiliary sectors such as logistics, retail, and business services.  I personally think this is a striking number when you compare that contribution to other sectors of our economy.  Interestingly enough, according to recent research, manufacturing could constitute as much as thirty-three percent (33%) of the U.S. GDP if both its entire value chain and production for other sectors are included.

Research from the Bureau of Labor Statistics shows that employment in manufacturing has been trending up since January of 2017. After double-digit gains in the first quarter of 2017, six thousand (6,000) new jobs were added in April.  Currently, the manufacturing industry employs 12,396,000 people, which equals more than nine percent (9%) of the U.S. workforce.   Nonetheless, many experts are concerned that these employment gains will soon be halted by the ever-rising adoption of automation. Yet automation is inevitable, and as in previous industrial revolutions, it is likely to result in job creation in the long term.  Let’s look back at the Industrial Revolution.

INDUSTRIAL REVOLUTION:

The Industrial Revolution began in the late 18th century when a series of new inventions such as the spinning jenny and steam engine transformed manufacturing in Britain. The changes in British manufacturing spread across Europe and America, replacing traditional rural lifestyles as people migrated to cities in search of work. Men, women and children worked in the new factories operating machines that spun and wove cloth, or made pottery, paper and glass.

Women under 20 comprised the majority of all factory workers, according to an article on the Industrial Revolution by the Economic History Association. Many power loom workers, and most water frame and spinning jenny workers, were women. However, few women were mule spinners, and male workers sometimes violently resisted attempts to hire women for this position, although some women did work as assistant mule spinners. Many children also worked in the factories and mines, operating the same dangerous equipment as adult workers.  As you might suspect, this was a great departure from times prior to the revolution.

WHERE WE ARE GOING:

In an attempt to create more jobs, the new administration is reassessing free trade agreements, levying tariffs on imports, and promising tax incentives to manufacturers to keep their production plants in the U.S. Yet while these measures are certainly making the U.S. more attractive for manufacturers, they’re unlikely to directly increase the number of jobs in the sector. What they will do, however, is free up more capital for manufacturers to invest in automation. This will have the following benefits:

  • Automation will reduce production costs and make U.S. companies more competitive in the global market. High domestic operating costs—in large part due to comparatively high wages—compromise the U.S. manufacturing industry’s position as the world leader. Our main competitor is China, where low-cost production plants currently produce almost eighteen percent (17.6%) of the world’s goods—just zero point six percent (0.6%) less than the U.S. Automation allows manufacturers to reduce labor costs and streamline processes. Lower manufacturing costs result in lower product prices, which in turn will increase demand.


  • Automation increases productivity and improves quality. Smart manufacturing processes that make use of technologies such as robotics, big data, analytics, sensors, and the IoT are faster, safer, more accurate, and more consistent than traditional assembly lines. Robotics provide 24/7 labor, while automated systems perform real-time monitoring of the production process. Irregularities, such as equipment failures or quality glitches, can be immediately addressed. Connected plants use sensors to keep track of inventory and equipment performance, and automatically send orders to suppliers when necessary (a minimal sketch of such an automatic reorder follows this list). All of this combined minimizes downtime while maximizing output and product quality.
  • Manufacturers will re-invest in innovation and R&D. Cutting-edge technologies such as robotics, additive manufacturing, and augmented reality (AR) are likely to be widely adopted within a few years. For example, Apple CEO Tim Cook recently announced the tech giant’s $1 billion investment fund aimed at assisting U.S. companies practicing advanced manufacturing. To remain competitive, manufacturers will have to re-invest a portion of their profits in R&D. An important aspect of innovation will involve determining how to integrate increasingly sophisticated technologies with human functions to create highly effective solutions that support manufacturers’ outcomes.
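Here is the minimal sketch promised above: a connected-plant reorder rule in Python, where a sensor-reported stock level below a threshold automatically raises a purchase-order record. The part numbers, thresholds, and quantities are assumptions made up for the example:

```python
import datetime

REORDER_POINT = 50     # units on hand that trigger a reorder (assumed)
ORDER_QUANTITY = 500   # standard replenishment lot (assumed)

def check_inventory(part_no: str, units_on_hand: int):
    """When a sensor-reported stock level falls below the reorder
    point, raise a purchase-order record automatically instead of
    waiting for a manual count."""
    if units_on_hand < REORDER_POINT:
        return {"purchase_order_for": part_no,
                "quantity": ORDER_QUANTITY,
                "raised_at": datetime.datetime.now().isoformat()}
    return None   # stock is fine, no action

print(check_inventory("BRKT-210", 37))   # low stock -> auto purchase order
print(check_inventory("BRKT-210", 480))  # healthy stock -> None
```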


HOW AUTOMATION WILL AFFECT THE WORKFORCE:

Now, let’s look at the five ways in which automation will affect the workforce.

  • Certain jobs will be eliminated.  By 2025, 3.5 million jobs will be created in manufacturing—yet due to the skills gap, two (2) million will remain unfilled. Certain repetitive jobs, primarily on the assembly line, will be eliminated.  This trend is with us right now, so retraining of employees is imperative.
  • Current jobs will be modified.  In sixty percent (60%) of all occupations, thirty percent (30%) of the tasks can be automated.  For the first time, we hear the word “co-bot”: robot-assisted manufacturing in which an employee works side-by-side with a robotic system.  It’s happening right now.
  • New jobs will be created. There are several ways automation will create new jobs. First, lower operating costs will make U.S. products more affordable, which will result in rising demand. This in turn will increase production volume and create more jobs. Second, while automation can streamline and optimize processes, there are still tasks that haven’t been or can’t be fully automated. Supervision, maintenance, and troubleshooting will all require a human component for the foreseeable future. Third, as more manufacturers adopt new technologies, there’s a growing need to fill new roles such as data scientists and IoT engineers. Fourth, as technology evolves due to practical application, new roles that integrate human skills with technology will be created and quickly become commonplace.
  • There will be a skills gap between eliminated jobs and modified or new roles. Manufacturers should partner with educational institutions that offer vocational training in STEM fields. By offering students on-the-job training, they can foster a skilled and loyal workforce.  Manufacturers need to step up and offer additional job training.  Employees need to step up and accept the training that is being offered.  Survival is dependent upon both.
  • The manufacturing workforce will keep evolving. Manufacturers must invest in talent acquisition and development—both to build expertise in-house and to facilitate continuous innovation.  Ten years ago, would you have heard the words RFID, biometrics, stereolithography, or additive manufacturing?  I don’t think so.  The workforce MUST keep evolving, because technology will only improve and become an ever-more-present force on the manufacturing floor.

As always, I welcome your comments.

AN AVERAGE DAY FOR DATA

August 4, 2017


I am sure you have heard the phrase “big data” and possibly wondered just what that terminology relates to.  Let’s get the “official” definition, as follows:

The amount of data that’s being created and stored on a global level is almost inconceivable, and it just keeps growing. That means there’s even more potential to glean key insights from business information – yet only a small percentage of data is actually analyzed. What does that mean for businesses? How can they make better use of the raw information that flows into their organizations every day?

The concept gained momentum in the early 2000s when industry analyst Doug Laney articulated the now-mainstream definition of big data as the three Vs (volume, velocity, and variety) plus two additional dimensions, variability and complexity:

  • Volume. Organizations collect data from a variety of sources, including business transactions, social media, and information from sensor or machine-to-machine data. In the past, storing it would’ve been a problem, but new technologies (such as Hadoop) have eased the burden.
  • Velocity. Data streams in at an unprecedented speed and must be dealt with in a timely manner. RFID tags, sensors, and smart metering are driving the need to deal with torrents of data in near-real time.
  • Variety. Data comes in all types of formats, from structured, numeric data in traditional databases to unstructured text documents, email, video, audio, stock ticker data, and financial transactions. (The sketch after this list shows one way mixed formats get normalized.)
  • Variability. In addition to the increasing velocities and varieties of data, data flows can be highly inconsistent, with periodic peaks. Is something trending in social media? Daily, seasonal, and event-triggered peak data loads can be challenging to manage, even more so with unstructured data.
  • Complexity. Today’s data comes from multiple sources, which makes it difficult to link, match, cleanse, and transform data across systems. However, it’s necessary to connect and correlate relationships, hierarchies, and multiple data linkages or your data can quickly spiral out of control.
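To illustrate the variety point, here is a minimal Python sketch that takes one record from a CSV source and one from a JSON machine feed and maps both onto a common schema, the routine first step before mixed-format data can be linked and analyzed. The field names are assumptions made up for the example:

```python
import csv
import io
import json

def normalize(record: dict) -> dict:
    """Map one incoming record, whatever its source format,
    onto a single common schema."""
    return {"source": record.get("source", "unknown"),
            "value": float(record.get("value", 0.0))}

# Structured CSV from a traditional system...
csv_rows = list(csv.DictReader(io.StringIO("source,value\nsensor_7,21.5")))
# ...and JSON from a machine-to-machine feed.
json_row = json.loads('{"source": "rfid_gate_2", "value": 3}')

for rec in csv_rows + [json_row]:
    print(normalize(rec))
```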

AN AVERAGE DAY IN THE LIFE OF BIG DATA:

A picture is worth a thousand words, but let us now quantify, on a daily basis, what we mean by big data.

  • YouTube’s viewers are watching a billion (1,000,000,000) hours of video each day.
  • We perform over forty thousand (40,000) searches per second on Google alone. That is approximately three and one-half (3.5) billion searches per day and roughly one point two (1.2) trillion searches per year, worldwide.
  • Five years ago, IBM estimated two point five (2.5) exabytes (2.5 billion gigabytes) of data generated every day. It has grown since then.
  • The number of e-mails sent per day is around 269 billion, which works out to roughly ninety-eight (98) trillion e-mails per year (the sketch after this list sanity-checks these figures). Globally, the data stored in data centers will quintuple by 2020 to reach 915 exabytes.  This is up 5.3-fold from 171 exabytes in 2015, a compound annual growth rate (CAGR) of forty percent (40%).
  • On average, an autonomous car will churn out 4 TB of data per day when factoring in cameras, radar, sonar, GPS, and LIDAR, and that is for just one hour of driving per day.  Every autonomous car will generate the data equivalent of almost 3,000 people.
  • By 2024, mobile networks will see machine-to-machine (M2M) connections jump ten-fold, from 250 million in 2014 to 2.3 billion, according to Machina Research.
  • The data collected by BMW’s current fleet of 40 prototype autonomous cars during a single test session would fill a stack of CDs 60 miles high.
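As promised, here is a quick Python sanity check of several of the figures above; it simply re-derives the yearly totals and the data-center growth from the quoted inputs:

```python
# Back-of-the-envelope checks on the figures quoted above.

searches_per_sec = 40_000
per_day = searches_per_sec * 60 * 60 * 24
per_year = per_day * 365
print(f"Google: {per_day / 1e9:.1f}B searches/day, {per_year / 1e12:.2f}T/year")
# -> about 3.5B/day and 1.26T/year, matching the 3.5B and 1.2T claims

emails_per_day = 269e9
print(f"E-mail: {emails_per_day * 365 / 1e12:.0f}T per year")   # ~98T

# Data centers: 171 EB in 2015 growing at a 40% CAGR for five years
eb_2020 = 171 * 1.40 ** 5
print(f"2020 estimate: {eb_2020:.0f} EB ({eb_2020 / 171:.1f}x growth)")
# -> ~920 EB, in line with the quoted 915 EB and 5.3-fold figures
```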

We have become a world that lives “by the numbers,” and I’m not sure that’s altogether troubling.  At no time in our history have we had access to data that informs, misinforms, directs, challenges, and more, as we have at this time.  How we use that data makes all the difference in our daily lives.  I have a great friend named Joe McGuinness. His favorite expression: “It’s about time we learn to separate the fly s_____t from the pepper.”  If we apply this phrase to big data, he may just be correct. Be careful out there.


Portions of the following post were taken from an article by Rob Spiegel published through Design News Daily.

Two former Apple design engineers, Anna Katrina Shedletsky and Samuel Weiss, have leveraged machine learning to help brand owners improve their manufacturing lines. Their company, Instrumental, uses artificial intelligence (AI) to identify and fix problems with the goal of helping clients ship on time. The AI system consists of camera-equipped inspection stations that allow brand owners to remotely manage product lines at their contract manufacturing facilities with the purpose of maximizing up-time, quality, and speed.

Shedletsky and Weiss took what they learned from years of working with Apple contract manufacturers and put it into AI software.

“The experience with Apple opened our eyes to what was possible. We wanted to build artificial intelligence for manufacturing. The technology had been proven in other industries and could be applied to the manufacturing industry; it’s part of the evolution of what is happening in manufacturing. The product we offer today solves a very specific need, but it also works toward overall intelligence in manufacturing.”

Shedletsky spent six (6) years working at Apple prior to founding Instrumental with fellow Apple alum Weiss, who serves as Instrumental’s CTO (Chief Technical Officer).  The two took their experience in solving manufacturing problems and created the AI fix. “After spending hundreds of days at manufacturers responsible for millions of Apple products, we gained a deep understanding of the inefficiencies in the new-product development process,” said Shedletsky. “There’s no going back; robotics and automation have already changed manufacturing. Intelligence like the kind we are building will change it again. We can radically improve how companies make products.”

There are numerous examples of big and small companies with problems that prevent them from shipping products on time. Delays are expensive and can cause the loss of a sale. One day of delay at a start-up could cost $10,000 in sales; for a large company, the cost could be millions. “There are hundreds of issues that need to be found and solved. They are difficult and they have to be solved one at a time,” said Shedletsky. “You can get on a plane, go to a factory and look at failure analysis so you can see why you have problems. Or, you can reduce the amount of time needed to identify and fix the problems by analyzing them remotely, using a combo of hardware and software.”

Instrumental combines hardware and software that takes images of each unit at key stages of assembly on the line. The system then makes those images remotely searchable and comparable so the brand owner can learn from and react to assembly line data, and engineers can take action on issues. “The station goes onto the assembly line in China,” said Shedletsky. “We get the data into the cloud to discover issues the contract manufacturer doesn’t know they have. With the data, you can do failure analysis and reduce the time it takes to find an issue and correct it.”
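Instrumental has not published its algorithms, but the core idea of comparing a unit’s photo against a known-good reference can be sketched in a few lines of Python with NumPy. Everything here (the stand-in images, the injected defect, the threshold) is an illustrative assumption, not Instrumental’s actual method:

```python
import numpy as np

def anomaly_score(unit: np.ndarray, golden: np.ndarray) -> float:
    """Mean absolute pixel difference between a photo of a unit on
    the line and a known-good 'golden' reference from the same station."""
    return float(np.mean(np.abs(unit.astype(float) - golden.astype(float))))

rng = np.random.default_rng(0)
golden = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in image

good_unit = golden.copy()
bad_unit = golden.copy()
bad_unit[20:30, 20:30] = 0          # simulate a missing component

THRESHOLD = 1.0                      # assumed, would be tuned per station
print(anomaly_score(good_unit, golden))                      # 0.0
print(anomaly_score(bad_unit, golden))                       # well above 0
print("flag for review:", anomaly_score(bad_unit, golden) > THRESHOLD)
```

A real system would align the images, normalize lighting, and learn what normal variation looks like, but even this crude difference score shows how a cloud service can rank units for an engineer’s review.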

WHAT IS AI:

Artificial intelligence (AI) is intelligence exhibited by machines.  In computer science, the field of AI research defines itself as the study of “intelligent agents”: any device that perceives its environment and takes actions that maximize its chance of success at some goal.  Colloquially, the term “artificial intelligence” is applied when a machine mimics “cognitive” functions that humans associate with other human minds, such as “learning” and “problem solving”.
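That definition (perceive the environment, act to maximize the chance of reaching a goal) is compact enough to sketch in a few lines of Python. Here is a deliberately tiny “intelligent agent,” a thermostat whose goal is to hold 20 C; all the values are illustrative:

```python
import random

class ThermostatAgent:
    """A deliberately tiny 'intelligent agent': it perceives its
    environment (a temperature reading) and picks the action that
    best serves its goal of holding 20 C."""
    GOAL_C = 20.0
    DEADBAND = 0.5   # tolerance around the goal

    def act(self, perceived_temp: float) -> str:
        if perceived_temp < self.GOAL_C - self.DEADBAND:
            return "heat"
        if perceived_temp > self.GOAL_C + self.DEADBAND:
            return "cool"
        return "idle"

agent = ThermostatAgent()
for _ in range(3):
    temp = random.uniform(15.0, 25.0)   # the environment supplies a percept
    print(f"{temp:.1f} C -> {agent.act(temp)}")
```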

As machines become increasingly capable, mental faculties once thought to require intelligence are removed from the definition. For instance, optical character recognition is no longer perceived as an example of “artificial intelligence”, having become a routine technology.  Capabilities currently classified as AI include successfully understanding human speech, competing at a high level in strategic game systems (such as chess and Go), autonomous cars, intelligent routing in content delivery networks, military simulations, and interpreting complex data.

FUTURE:

Some would have you believe that AI IS the future and we will succumb to the “Rise of the Machines”.  I’m not so melodramatic.  I feel AI has progressed and will progress to the point where great time saving and reduction in labor may be realized.   Anna Katrina Shedletsky and Samuel Weiss realize the potential and feel there will be no going back from this disruptive technology.   Moving AI to the factory floor will produce great benefits to manufacturing and other commercial enterprises.   There is also a significant possibility that job creation will occur as a result.  All is not doom and gloom.
