MONEY AND BANK SAFETY

November 1, 2017


Do you ever wonder whether the hard-earned money you bring home every week or month is safe?

According to the FDIC (as of September 13, 2016):  “The basic FDIC coverage is good for up to $250,000 per depositor per bank. If you have more than that in a failed bank, the FDIC might choose to cover your losses, but there is no promise to do so.”

The Federal Deposit Insurance Corporation (FDIC) preserves and promotes public confidence in the U.S. financial system by insuring deposits in banks and thrift institutions for up to $250,000 per depositor, per insured bank, for each ownership category, and by identifying, monitoring, and addressing risks to the deposit insurance funds.  This is the law.  Good to know.
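To make the coverage rule concrete, here is a minimal sketch in Python of how the basic $250,000 limit applies per depositor, per insured bank, per ownership category. The balances are made up, and the sketch ignores finer points of the rules (for example, joint accounts are actually insured up to $250,000 per co-owner).

    FDIC_LIMIT = 250_000  # basic coverage per depositor, per bank, per ownership category

    # Hypothetical balances: {bank: {ownership_category: balance}}
    deposits = {
        "Bank A": {"single": 180_000, "joint": 300_000},
        "Bank B": {"single": 400_000},
    }

    for bank, categories in deposits.items():
        for category, balance in categories.items():
            insured = min(balance, FDIC_LIMIT)
            uninsured = max(balance - FDIC_LIMIT, 0)
            print(f"{bank} ({category}): insured ${insured:,}, uninsured ${uninsured:,}")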

The following list shows that our banking system has experienced some “hard times” in the recent past.  Let’s take a look at bank failures in this country, and then we will look at the safest countries in which to keep your money.

BANK FAILURES:

The following figures are taken from the website Bankrate.com.

YEAR                      NUMBER OF BANK FAILURES

2016 (Estimated)                              1
2015 (Estimated)                              8
2014 (Estimated)                             18
2013 (Estimated)                             24
2012 (Estimated)                             51
2011 (Official)                              92
2010 (Official)                             157
2009 (Official)                             140

As you can see, from 2009 through 2016 there were four hundred ninety-one (491) bank failures in this country.
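A quick arithmetic check of that total, summing the figures in the table above in Python:

    failures_by_year = {2016: 1, 2015: 8, 2014: 18, 2013: 24, 2012: 51, 2011: 92, 2010: 157, 2009: 140}
    print(sum(failures_by_year.values()))  # prints 491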

Now, the Survey of Consumer Finances (SCF) is conducted and published every three years, most recently in 2013. According to the Federal Reserve, “the survey data include information on families’ balance sheets, pensions, income, and demographic characteristics.” Data from previous SCF years show significant changes in checking account balances since 2001. An analysis of average checking account balances based on the same data is shown below:

YEAR     AVERAGE CHECKING BALANCE

2013     $9,132
2010     $7,036
2007     $6,203
2004     $7,382
2001     $6,404

As you can see, most people are well within the coverage limit if and when their individual bank fails.  That raises the question:  what are the safest countries in which to deposit money?  Let’s count down the fourteen countries whose banks the World Economic Forum (WEF) survey rates as the soundest, from the 14th safest to the safest.  Some of the rankings may be very surprising.

SAFEST COUNTRIES IN WHICH TO BANK:

  14. Czech Republic — The Czech banking sector is unusual in that foreign-owned lenders dominate the industry, but consumers don’t seem to mind, ranking their banks the 14th safest in the world.
  13. Guatemala — The densely populated Central American nation of 15.5 million people has three key players in its banking system — Banco Industrial, Banco G&T Continental, and Banco de Desarrollo Rural. All three are seen as being fairly sound, according to the WEF’s survey.
  12. Luxembourg — It’s no surprise Luxembourg scores highly, as the country is famous for its financial sector. Its Banque et Caisse d’Épargne de l’État is often cited as one of the safest banks on earth.
  11. Panama — As the country has no central bank, Panamanian lenders are run conservatively, with capital ratios almost twice the required minimum on average. Traditionally seen as a tax haven, the country has made substantial strides to shake off that reputation since the financial crisis.
  10. Sweden — Although Swedish lenders are being squeezed by the Riksbank’s negative interest rate policy, Swedish banks are still among the safest in the world, according to the WEF.
  9. Chile — In July, ratings agency Fitch cut the outlook of the country’s banking system to negative, based on “weakening asset quality and profitability,” but that hasn’t spooked Chileans, according to the WEF.
  8. Singapore — Singapore is renowned as one of the world’s great financial centres, and the soundness of its banking sector reflects that.
  7. Norway — As an oil-reliant economy, Norway has faced serious issues in recent years, and in August its banking system had its outlook cut to negative by Moody’s. However, the country’s banks remain very sound, the WEF’s survey suggests.
  6. Hong Kong — Another global financial centre, Hong Kong is home to arms of most of the world’s biggest banks, and some of the world’s safest financial institutions.
  5. Australia — A small group of four major banks divides up most of Australia’s banking sector, while foreign banks are tightly regulated, keeping the system sturdy.
  4. New Zealand — New Zealand’s banking sector is dominated by a group of five financial players. Decent profits and growth without too much competition have seen the sector thrive, although it slips from second last year to fourth in 2016.
  3. Canada — Canadian banks have long been a byword for stability. The country has had only two small regional bank failures in almost 100 years, and had zero failures during the Great Depression of the 1930s. Last year, the country’s banks were seen as the safest on earth, so confidence has clearly slipped a little.
  2. South Africa — South Africa’s so-called ‘Big Four’ — Standard Bank, FirstRand Bank, Nedbank, and Barclays Africa — dominate the country’s consumer sector, and only one other nation’s banks score higher.
  1. Finland — Finland’s banking sector is dominated by co-operative and savings banks, which take little risk. The country’s central bank governor, Erkki Liikanen, has led the way on proposals to split investment banking and deposit-taking activities at European lenders. Ranked fourth in 2015’s list, Finland’s banks have become even safer this year.

According to the same World Economic Forum survey, the United States ranked thirty-sixth (36th) in depositor safety.

CONCLUSIONS:

I’m definitely not saying to run out tomorrow and transfer all of your money to a bank in one of the countries above, but really, can’t we do better as a country?  Can’t the Fed just get out of the way?  Regulations and banking philosophy are to blame for the failures given above, not to mention plain old GREED.  Remember Wells Fargo?

 


ASTROLABE

October 25, 2017


Information for the following post was taken from an article entitled “It’s Official: Earliest Known Marine Astrolabe Found in Shipwreck” by Laura Geggel, senior writer for LiveScience, 25 October 2017.

It’s amazing to me how much history is yet to be discovered, understood and transmitted to readers such as you and me.   I read a fascinating article some months ago indicating the history we do NOT know far exceeds the history we DO know.  Of course, the “winners” get to write their version of what happened.  This is as it has always been. In the great and grand scheme of things, we have artifacts and mentifacts.

ARTIFACT:

“Any object made by human beings, especially with a view to subsequent use.  A handmade object, such as a tool, or the remains of one, such as a shard of pottery, characteristic of an earlier time or cultural stage, especially such an object found at an archaeological excavation.”

MENTIFACT:

“Mentifact (sometimes called a “psychofact”) is a term coined by Sir Julian Sorell Huxley, used together with the related terms “sociofact” and “artifact” to describe how cultural traits, such as “beliefs, values, ideas,” take on a life of their own spanning over generations, and are conceivable as objects in themselves.”

The word astrolabe is defined as follows:

The astrolabe is a very ancient astronomical computer for solving problems relating to time and the position of the Sun and stars.  Several types of astrolabes have been made.  By far the most popular type is the planispheric astrolabe, on which the celestial sphere is projected onto the plane of the equator.  A typical old astrolabe was made of brass and was approximately six (6) inches in diameter, although much larger and smaller astrolabes were also fabricated.

The subject of this post is the device described below.

FIND:

More than 500 years ago, a fierce storm sank a ship carrying the earliest known marine astrolabe — a device that helped sailors navigate at sea, new research finds. Divers found the artifact in 2014, but were unsure exactly what it was at the time. Now, thanks to a 3D-imaging scanner, scientists were able to find etchings on the bronze disc that confirmed it was an astrolabe.

“It was fantastic to apply our 3D scanning technology to such an exciting project and help with the identification of such a rare and fascinating item,” Mark Williams, a professorial fellow at the Warwick Manufacturing Group at the University of Warwick, in the United Kingdom, said in a statement. Williams and his team did the scan.

 

The marine astrolabe likely dates to between 1495 and 1500, and was aboard a ship known as the Esmeralda, which sank in 1503. The Esmeralda was part of a fleet led by Portuguese explorer Vasco da Gama, the first known person to sail directly from Europe to India.

In 2014, an expedition led by Blue Water Recoveries excavated the Esmeralda shipwreck and recovered the astrolabe. But because researchers couldn’t discern any navigational markings on the disc, which is almost seven (7) inches (17.5 centimeters) in diameter, they were cautious about labeling it without further evidence.

Now, the new scan reveals etchings around the edge of the disc, each separated by five degrees, Williams found. This detail proves it’s an astrolabe, as these markings would have helped mariners measure the height of the sun above the horizon at noon — a reading that helped them figure out their latitude while at sea, Williams said.  The disc is also engraved with the Portuguese coat of arms and the personal emblem of Dom Manuel I, Portugal’s king from 1495 to 1521.  “Usually we are working on engineering-related challenges, so to be able to take our expertise and transfer that to something totally different and so historically significant was a really interesting opportunity,” Williams said.
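As a rough illustration of the principle (not the historical procedure), here is a simplified noon-sight calculation in Python. It assumes an observer north of the sun and ignores the sign conventions that depend on hemisphere and season; the numbers are purely illustrative.

    def latitude_from_noon_sight(sun_altitude_deg, solar_declination_deg):
        # Zenith distance is how far the sun sits from directly overhead at local noon.
        zenith_distance = 90.0 - sun_altitude_deg
        # Simplified case: observer in the northern hemisphere with the sun to the south.
        return zenith_distance + solar_declination_deg

    # Example: the sun measured 50 degrees above the horizon at local noon, with a
    # solar declination of +10 degrees, puts the ship near 50 degrees north latitude.
    print(latitude_from_noon_sight(50.0, 10.0))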

CONCLUSIONS:

Only through three-dimensional scanning could the purpose of this device be confirmed.  Once again, modern technology allows for the unveiling of the truth.  The engravings tied to Portugal’s king nailed down the time period.  This is a significant find, and it adds to what we know about the early voyages of exploration.

DISTRACTIONS

October 18, 2017


Is there anyone in the United States who does NOT use our road systems on a daily basis?  Only senior citizens in medical facilities and those unfortunate enough to have health problems stay off the roads.  I have a daily commute of approximately thirty-seven (37) miles, one way, and you would not believe what I see.  Then again, maybe you would.  You’ve been there, done that, got the T-shirt.

It’s no surprise to learn that in-vehicle infotainment systems cause driver distraction, but recent news from the AAA Foundation for Traffic Safety indicates the problem may be worse than we thought. A study released by the organization showed that the majority of today’s infotainment systems are complex, frustrating, and maybe even dangerous to use. Working with researchers from the University of Utah, AAA analyzed the systems in thirty (30) vehicles, rating them on how much visual and cognitive demand they placed on drivers. The conclusion: none of the thirty produced low demand. Twenty-three (23) of the systems generated “high” or “very high” demand.

“Removing eyes from the road for just two seconds doubles the risk for a crash,” AAA wrote in a press release. “With one in three adults using the systems available while driving, AAA cautions that using these technologies while behind the wheel can have dangerous consequences.”

In the study, University of Utah researchers examined visual (eyes-on-the-road) and cognitive (mental) demands of each system, and looked at the time required to complete tasks. Tasks included the use of voice commands and touch screens to make calls, send texts, tune the radio and program navigation. And the results were uniformly disappointing—really disappointing.

The researchers categorized twelve (12) of the vehicles as having “very high demand” infotainment systems. The vehicles range from entry-level to luxury and from sedan to SUV, but they all share one common trait: AAA says their systems distract drivers.  This is, to me, very discouraging.

CONCLUSIONS:

I’m definitely NOT saying don’t buy these cars, but their distraction levels are worth knowing about and compensating for when driving.

AUGMENTED REALITY (AR)

October 13, 2017


Ask just about anybody to give a definition of Virtual Reality (VR) and they will take a stab at it. This is because the gaming and entertainment segments of our population have used VR as a new tool to promote games such as SuperHot VR, Rock Band VR, House of the Dying Sun, Minecraft VR, Robo Recall, and others.  If you ask them about Augmented Reality (AR), they will probably give you the definition of VR, or nothing at all.

Augmented Reality, sometimes called Mixed Reality, is a technology that merges real-world objects or environments with computer-generated virtual elements such as sound, video, graphics, or GPS data.  Unlike VR, which completely replaces the real world with a virtual one, AR operates in real time and is interactive with objects found in the environment, overlaying a virtual display on top of the real one.
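As a toy illustration of that overlay idea, here is a minimal sketch that draws virtual elements (a box and a text label) on top of a live camera feed. It assumes Python with the OpenCV package (cv2) installed and a webcam at index 0; it is not tied to any of the products mentioned below.

    import cv2
    import datetime

    cap = cv2.VideoCapture(0)  # open the default camera (the "real world" view)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Virtual elements overlaid on the real scene: a box and a text label.
        cv2.rectangle(frame, (50, 50), (300, 150), (0, 255, 0), 2)
        cv2.putText(frame, f"Overlay: {datetime.datetime.now():%H:%M:%S}",
                    (55, 100), cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
        cv2.imshow("Minimal AR-style overlay", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
            break

    cap.release()
    cv2.destroyAllWindows()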

While popularized by gaming, AR technology has shown real prowess at bringing an interactive digital world into a person’s perceived real world, where the digital layer can reveal more information about a real-world object the person is looking at.  This is basically what AR strives to do.  We are going to take a look at several very real applications of AR to indicate the possibilities of this technology.

  • Augmented Reality has found a home in healthcare, helping professionals receive information about the status of patients. Healthcare giant Cigna recently launched a program called BioBall that uses Microsoft HoloLens technology in an interactive game to test for blood pressure and body mass index (BMI). Patients hold a light, medium-sized ball in their hands in a one-minute race to capture all the images that flash on the screen in front of them. The BioBall senses a player’s heartbeat. At the University of Maryland’s Augmentarium virtual and augmented reality laboratory, the school is using AR in healthcare to improve how ultrasound is administered to a patient.  Physicians wearing an AR device can look at both a patient and the ultrasound device while images flash on the “hood” of the AR device itself.
  • AR is opening up new methods to teach young children a variety of subjects they might not be interested in learning or, in some cases, to help those who have trouble in class catch up with their peers. The University of Helsinki’s AR program helps struggling kids learn science by enabling them to virtually interact with molecule movement in gases, gravity, sound waves, and airplane wind physics.  AR creates new types of learning possibilities by transporting “old knowledge” into a new format.
  • Projection-based AR is emerging as a new way to cast virtual elements into the real world without the use of bulky headgear or glasses. That is why AR is becoming a very popular alternative for use in the office or during meetings. Startups such as Lampix and Lightform are working on projection-based augmented reality for use in the boardroom, retail displays, hospitality rooms, digital signage, and other applications.
  • In Germany, a company called FleetBoard is developing application software that tracks logistics for truck drivers to help with the long series of pre-departure checks before setting off cross-country or for local deliveries. The FleetBoard Vehicle Lens app uses a smartphone and software to provide live image recognition to identify the truck’s number plate.  The relevant information is superimposed in AR, thus speeding up the pre-departure process.
  • Last winter, Delft University of Technology in the Netherlands started working with first responders in using AR as a tool in crime scene investigation. The handheld AR system allows on-scene investigators and remote forensic teams to minimize the potential for site contamination.  This could be extremely helpful in finding traces of DNA, preserving evidence, and getting medical help from an outside source.
  • Sandia National Laboratories is working with AR as a tool to improve security training for users who are protecting vulnerable areas such as nuclear weapons or nuclear materials. The physical security training helps guide users through real-world examples such as theft or sabotage in order to be better prepared when an event takes place.  The training can be accomplished remotely and cheaply using standalone AR headsets.
  • In Finland, the VTT Technical Research Center recently developed an AR tool for the European Space Agency (ESA) for astronauts to perform real-time equipment monitoring in space. AR prepares astronauts with in-depth practice by coordinating the activities with experts in a mixed-reality situation.
  • U.S.-based Daqri International uses computer vision for industrial AR to enable data visualization while working on machinery or in a warehouse. Glasses and headsets from Daqri display project data, tasks that need to be completed, and potential problems with machinery, or even show where an object needs to be placed or repaired.

CONCLUSIONS:

Augmented Reality merges real-world objects with virtual elements generated from sensory input to provide great advantages to the user.  No longer are gaming and entertainment the sole objectives of its use.  AR brings to life a “new normal” for professionals seeking more and better technology to provide solutions to real-world problems.

DEGREE OR NO DEGREE

October 7, 2017


The availability of information in books (as always), on the Internet, through seminars and professional shows, scientific publications, podcasts, webinars, etc. is amazing in today’s “digital age”.  That raises the question: is a college degree really necessary?   Can you rise to a level of competence and succeed by being self-taught?  For most, a college degree is the way to open doors. For a precious few, however, no help is needed.

Let’s look at twelve (12) individuals who did just that.

The co-founder of Apple and the force behind the iPod, iPhone, and iPad, Steve Jobs attended Reed College, an academically-rigorous liberal arts college with a heavy emphasis on social sciences and literature. Shortly after enrolling in 1972, however, he dropped out and took a job as a technician at Atari.

Legendary industrialist Howard Hughes is often said to have graduated from Caltech, but the truth is that the California school has no record of his having attended classes there. He did enroll at Rice University in Texas in 1924, but dropped out prematurely due to the death of his father.

Arguably Harvard’s most famous dropout, Bill Gates was already an accomplished software programmer when he started as a freshman at the Massachusetts campus in 1973. His passion for software actually began before high school, at the Lakeside School in Seattle, Washington, where he was programming in BASIC by age 13.

Just like his fellow Microsoft co-founder Bill Gates, Paul Allen was a college dropout.

Like Gates, he was also a star student (a perfect score on the SAT) who honed his programming skills at the Lakeside School in Seattle. Unlike Gates, however, he went on to study at Washington State University before leaving in his second year to work as a programmer at Honeywell in Boston.

Even for his time, Thomas Edison had little formal education. His schooling didn’t start until age eight, and then only lasted a few months.

Edison said that he learned most of his reading, writing, and math at home from his mother. Still, he became known as one of America’s most prolific inventors, amassing 1,093 U.S. patents and changing the world with such devices as the phonograph, fluoroscope, stock ticker, motion picture camera, mechanical vote recorder, and long-lasting incandescent electric light bulb. He is also credited with patenting a system of electrical power distribution for homes, businesses, and factories.

Michael Dell, founder of Dell Computer Corp., seemed destined for a career in the computer industry long before he dropped out of the University of Texas. He purchased his first calculator at age seven, applied to take a high school equivalency exam at age eight, and performed his first computer teardown at age 15.

A pioneer of early television technology, Philo T. Farnsworth was a brilliant student who dropped out of Brigham Young University after the death of his father, according to Biography.com.

Although born in a log cabin, Farnsworth quickly grasped technical concepts, sketching out his revolutionary idea for a television vacuum tube while still in high school, much to the confusion of teachers and fellow students.

Credited with inventing the controls that made fixed-wing powered flight possible, the Wright Brothers had little formal education.

Neither attended college, but they gained technical knowledge from their experiences working with printing presses, bicycles, and motors. By doing so, they were able to develop a three-axis controller, which served as the means to steer and maintain the equilibrium of an aircraft.

Stanford Ovshinsky managed to amass 400 patents covering subjects ranging from nickel-metal hydride batteries to amorphous silicon semiconductors to hydrogen fuel cells, all without the benefit of a college education. He is best known for his formation of Energy Conversion Devices and his pioneering work in nickel-metal hydride batteries, which have been widely used in hybrid and electric cars, as well as laptop computers, digital cameras, and cell phones.

Preston Tucker, designer of the infamous 1948 Tucker sedan, worked as a machinist, police officer and car salesman, but was not known to have attended college. Still, he managed to become founder of the Tucker Aviation Corp. and the Tucker Corp.

Larry Ellison dropped out of his pre-med studies at the University of Illinois in his second year and left the University of Chicago after only one term, but his brief academic experiences eventually led him to the top of the computer industry.

A Harvard dropout, Mark Zuckerberg was considered a prodigy before he even set foot on campus.

He began doing BASIC programming in middle school, created an instant messaging system while in high school, and learned to read and write French, Hebrew, Latin, and ancient Greek prior to enrolling in college.

CONCLUSIONS:

In conclusion, I want to leave you with a quote from President Calvin Coolidge:

Nothing in this world can take the place of persistence. Talent will not; nothing is more common than unsuccessful men with talent. Genius will not; unrewarded genius is almost a proverb. Education will not; the world is full of educated derelicts. Persistence and determination alone are omnipotent.

AMAZING GRACE

October 3, 2017


There are many people responsible for the revolutionary development and commercialization of the modern-day computer.  Just a few of those names are given below; many of them you have probably never heard of.  Let’s take a look.

COMPUTER REVOLUTIONARIES:

  • Howard Aiken–Aiken was the original conceptual designer behind the Harvard Mark I computer in 1944.
  • Grace Murray Hopper–Hopper coined the term “debugging” in 1947 after removing an actual moth from a computer. Her ideas about machine-independent programming led to the development of COBOL, one of the first modern programming languages. On top of it all, the Navy destroyer USS Hopper is named after her.
  • Ken Thompson and Dennis Ritchie–These guys invented Unix in 1969, the importance of which CANNOT be overstated. Consider this: your fancy Apple computer relies almost entirely on their work.
  • Doug and Gary Carlston–This team of brothers co-founded Brøderbund Software, a successful gaming company that operated from 1980 to 1999. In that time, they were responsible for churning out or marketing revolutionary computer games like Myst and Prince of Persia, helping bring computing into the mainstream.
  • Ken and Roberta Williams–This husband and wife team founded On-Line Systems in 1979, which later became Sierra On-Line. The company was a leader in producing graphical adventure games throughout the advent of personal computing.
  • Seymour Cray–Cray was a supercomputer architect whose computers were the fastest in the world for many decades. He set the standard for modern supercomputing.
  • Marvin Minsky–Minsky was a professor at MIT and oversaw the AI Lab, a hotspot of hacker activity, where he let prominent programmers like Richard Stallman run free. Were it not for his open-mindedness, programming skill, and ability to recognize that important things were taking place, the AI Lab wouldn’t be remembered as the talent incubator that it is.
  • Bob Albrecht–He founded the People’s Computer Company and developed a sincere passion for encouraging children to get involved with computing. He’s responsible for ushering in innumerable new young programmers and is one of the first modern technology evangelists.
  • Steve Dompier–At a time when computer speech was just barely being realized, Dompier made his computer sing. It was a trick he unveiled at the first meeting of the Homebrew Computer Club in 1975.
  • John McCarthy–McCarthy invented Lisp, the second-oldest high-level programming language that’s still in use to this day. He’s also responsible for bringing mathematical logic into the world of artificial intelligence — letting computers “think” by way of math.
  • Doug Engelbart–Engelbart is most noted for inventing the computer mouse in the mid-1960s, but he’s made numerous other contributions to the computing world. He created early GUIs and was even a member of the team that developed the now-ubiquitous hypertext.
  • Ivan Sutherland–Sutherland received the prestigious Turing Award in 1988 for inventing Sketchpad, the predecessor to the type of graphical user interfaces we use every day on our own computers.
  • Tim Paterson–He wrote QDOS, an operating system that he sold to Bill Gates in 1980. Gates rebranded it as MS-DOS, selling it to the point that it became the most widely used operating system of the day. (How ‘bout them apples?)
  • Dan Bricklin–He’s “The Father of the Spreadsheet.” Working in 1979 with Bob Frankston, he created VisiCalc, a predecessor to Microsoft Excel. It was the killer app of the time — people were buying computers just to run VisiCalc.
  • Bob Kahn and Vint Cerf–Prolific internet pioneers, these two teamed up to build the Transmission Control Protocol and the Internet Protocol, better known as TCP/IP. These are the fundamental communication technologies at the heart of the Internet.
  • Niklaus Wirth–Wirth designed several programming languages, but is best known for creating Pascal. He won a Turing Award in 1984 for “developing a sequence of innovative computer languages.”

ADMIRAL GRACE MURRAY HOPPER:

At this point, I want to highlight Admiral Grace Murray Hopper, or “Amazing Grace” as she is called in the computer world and the United States Navy.

Born in New York City in 1906, Grace Hopper joined the U.S. Navy during World War II and was assigned to program the Mark I computer. She continued to work in computing after the war, leading the team that created the first computer language compiler, which led to the popular COBOL language. She resumed active naval service at the age of 60, becoming a rear admiral before retiring in 1986. Hopper died in Virginia in 1992.

Born Grace Brewster Murray in New York City on December 9, 1906, Grace Hopper studied math and physics at Vassar College. After graduating from Vassar in 1928, she proceeded to Yale University, where, in 1930, she received a master’s degree in mathematics. That same year, she married Vincent Foster Hopper, becoming Grace Hopper (a name that she kept even after the couple’s 1945 divorce). Starting in 1931, Hopper began teaching at Vassar while also continuing to study at Yale, where she earned a Ph.D. in mathematics in 1934—becoming one of the first few women to earn such a degree.

After the war, Hopper remained with the Navy as a reserve officer. As a research fellow at Harvard, she worked with the Mark II and Mark III computers. She was at Harvard when a moth was found to have shorted out the Mark II, and is sometimes given credit for the invention of the term “computer bug”—though she didn’t actually author the term, she did help popularize it.

Hopper retired from the Naval Reserve in 1966, but her pioneering computer work meant that she was recalled to active duty—at the age of 60—to tackle standardizing communication between different computer languages. She would remain with the Navy for 19 years. When she retired in 1986, at age 79, she was a rear admiral as well as the oldest serving officer in the service.

Saying that she would be “bored stiff” if she stopped working entirely, Hopper took another job post-retirement and stayed in the computer industry for several more years. She was awarded the National Medal of Technology in 1991—becoming the first female individual recipient of the honor. At the age of 85, she died in Arlington, Virginia, on January 1, 1992. She was laid to rest in the Arlington National Cemetery.

CONCLUSIONS:

In 1997, the guided missile destroyer USS Hopper was commissioned by the Navy in San Francisco. In 2004, the University of Missouri honored Hopper with a computer museum on its campus, dubbed “Grace’s Place.” On display are early computers and computer components to educate visitors on the evolution of the technology. In addition to her programming accomplishments, Hopper’s legacy includes encouraging young people to learn how to program. The Grace Hopper Celebration of Women in Computing Conference is a technical conference that encourages women to become part of the world of computing, while the Association for Computing Machinery offers a Grace Murray Hopper Award. Additionally, on her birthday in 2013, Hopper was remembered with a “Google Doodle.”

In 2016, Hopper was posthumously honored with the Presidential Medal of Freedom by Barack Obama.

Who said women could not “do” STEM (Science, Technology, Engineering and Mathematics)?

HACKED OFF

October 2, 2017


Portions of this post are taken from an article by Rob Spiegel of Design News Daily.

You can now anonymously hire a cybercriminal online for as little as six to ten dollars ($6 to $10) per hour, says Rodney Joffe, senior vice president at Neustar, a cybersecurity company. As it becomes easier to engineer such attacks, with costs falling, more businesses are getting targeted. About thirty-two (32) percent of information technology professionals surveyed said DDoS attacks cost their companies $100,000 an hour or more. That percentage is up from thirty (30) percent reported in 2014, according to Neustar’s survey of over 500 high-level IT professionals. The data was released Monday.

Hackers are costing consumers and companies between $375 billion and $575 billion annually, according to a study published this past Monday, a number only expected to grow as online information theft expands with increased Internet use.  This number blows my mind.   I actually had no idea the costs were so great.  Great, and increasing.

Online crime is estimated at 0.8 percent of worldwide GDP, with developed countries in regions including North America and Europe losing more than countries in Latin American or Africa, according to the new study published by the Center for Strategic and International Studies and funded by cybersecurity firm McAfee.

That amount rivals the share of worldwide GDP – 0.9 percent – that is spent on managing the narcotics trade. The difference in costs between developed and developing nations may be due to better accounting or transparency in developed nations, as the cost of online crime can be difficult to measure and some companies do not disclose when they are hacked for fear of damage to their reputations, the report said.

Cyber attacks have changed in recent years. Gone are the days when relatively benign bedroom hackers broke into organizations just to show off their skills.  No longer is it a guy in his mom’s basement eating Doritos.  Attackers now are often sophisticated criminals who target employees with access to the organization’s jewels. Instead of using blunt force, these savvy criminals use age-old human fallibility to con unwitting employees into handing over the keys to the vault.

Professional criminals like the crime opportunities they’ve found on the internet; it’s far less dangerous than slinging guns. Cybersecurity is getting worse. Criminal gangs have discovered they can carry out crime more effectively over the internet, with less chance of getting caught.

Hacking individual employees is often the easiest way into a company.  One of the cheapest and most effective ways to target an organization is to target its people. Attackers use psychological tricks that have been used throughout human history, and the internet lets those con tricks be carried out on a large scale. The criminals do reconnaissance over email to find out about their targets, then take advantage of key human traits.

One common attack comes as an email impersonating a CEO or supplier. The email looks like it came from your boss or a regular supplier, but it’s actually targeted at a specific professional in the organization.   The email might say, ‘We’ve acquired a new organization. We need to pay them. We need the company’s bank details, and we need to keep this quiet so it won’t affect our stock price.’ The email will go on to say, ‘We only trust you, and you need to do this immediately.’ The email comes from a criminal, using triggers like flattery: ‘You’re the most trusted individual in the organization.’ The criminals play on authority and create the panic of time pressure. Believe it or not, my consulting company has received these messages, the most recent purporting to come from Experian.

Even long-term attacks can be launched using this tactic of a CEO message. A company in Malaysia received kits purporting to come from the CEO.  The users were told the kit needed to be installed. It took months before the company found out it didn’t come from the CEO at all.

Instead of relying on more advanced technology, some of the new hackers are deploying classic con moves, playing against personal foibles. They are taking advantage of those base aspects of human nature and how we’re taught to behave.   We have to make sure we have better awareness, and for cybersecurity awareness to be engaging, it has to have an impact.

As well as entering the email stream, hackers are identifying the personal interests of victims on social media. Every kind of media is used for attacks. Social media is used to carry out reconnaissance, to identify targets and learn about them.  Users need to see what attackers can find out about them on Twitter or Facebook. The trick hackers use is to pretend they know the target; then they get close through personal interaction on social media. You can look at an organization on Twitter and see who works in finance. Then they take a good look across social platforms to find those individuals and see whether they go to a class each week or traveled to Iceland in 1996.  You can put together a spear-phishing campaign where you say, ‘Hey, I went on this trip with you.’

CONCLUSIONS:

The countermeasure to personal hacking is education and awareness. A company can identify potential weaknesses and potential targets and then change the vulnerable aspects of the corporate environment.  We have to look at the culture of the organization. Those who are under pressure are prime targets; they don’t have time to study each email they get. We also have to discourage over-reliance on email.   Hackers also exploit a culture of fear, where people are punished for their mistakes. Those are the people most in danger. We need to create a culture where, if someone makes a mistake, they can immediately come forward. The quicker someone comes forward, the quicker we can deal with it.
