We all wish the very best available education for our children and grandchildren, whether public or private.  Local school districts often struggle to maintain older schools and to provide the upgrades needed to keep buildings safe and functional.  The digital age has dramatically changed what schools need, and security has become essential.  Let’s take a look at what Consulting-Specifying Engineer magazine tells us about the trends and designs of NEW schools that meet the needs of modern-day students.

  • Technology is touching all aspects of modern school systems and is a key component of content display and communication within the classroom. Teachers and students are no longer static within the classroom.  They are mobile and flexible, which creates the need for robust, flexible, and in most cases wireless infrastructure that responds to, and does not distract from, learning.
  • Multi-purpose facilities with large central areas that can serve as cafeteria, theater, and even gymnasium are key to this trend. Individual classrooms are quickly becoming a thing of the past. The mechanical, electrical, and plumbing equipment must be flexible enough to support these many uses and to transition quickly from one to the next.
  • SECURITY is an absolute must when considering a new school building. Site access must be limited, with movement throughout the building secured by in-service cameras and card access.  This must be accomplished without the school looking like a prison.
  • Color tuning, a new term for me, is accomplished with paint and lighting and creates an atmosphere for maximum learning. These efforts produce a more natural environment, more in line with circadian rhythms.  Warmer color-temperature paints can increase relaxation and reduce stress during learning.
  • IAQ (Indoor Air Quality). According to the EPA:
    • Fifty percent (50%) of the schools in the U.S. today have issues linked to deficient or failing IAQ.
    • Deficient IAQ increases asthma risk by fifty percent (50%).
    • Test scores can drop by twenty-one percent (21%) with insufficient IAQ.
    • Schools with deficient IAQ have lower average student attendance rates.
    • Cleaner indoor air promotes better health for students and teachers.
    • Implementing IAQ management can boost test scores by over fifteen percent (15%).
    • Greater ventilation can reduce absenteeism by ten (10) absences per one thousand students.
  • School administrators and school boards demand facilities equipped with sufficient lighting and fire protection. Heating and air conditioning, as well as the electrical systems that drive this hardware, must be energy efficient.  Emergency generators are becoming a basic requirement to keep card readers and emergency door access operating.
  • Voice evacuation fire alarm, performance sound, and telecommunication systems must be provided and must be kept active by emergency generators if power failures occur.
  • More and more high schools offer advanced placement courses generating college credits accepted by universities and colleges. State-of-the-art equipment facilitates this possibility. We are talking about laboratories, compressed air systems, medical and dental equipment, IT facilities, natural gas distribution systems, environmental systems supporting biodiesel, solar and wind turbines, and other specialized equipment.  Many schools offer classes at night as well as during the day.
  • All codes, local, state, federal and international MUST be adhered to with no exceptions.
  • Construction costs account for twenty to forty percent (20-40%) of the total life-cycle costs so maintenance and replacement must be considered when designing facilities.
  • Control systems providing for energy savings during off-peak hours must be designed into school building facilities.
  • LED lighting is becoming a must; dimmable controls, occupancy/vacancy sensors, and daylight harvesting are certainly desirable.
  • For schools in the Midwest and other tornado-prone areas of the country, tornado shelters must be considered; they can certainly save lives when available.

These are just a few of the requirements architects and design engineers face when quoting a package to school boards and regional school systems.  The work is much more sophisticated than ever before, with requirements never contemplated a generation ago.  Times are changing, and for the better.


One source for this post is the Forbes article “U.S. Dependence on Foreign Oil Hits 30-Year Low” by Mr. Mike Patton.  Other sources were used as well.

The United States is at this point in time “energy independent”—for the most part.   Do you remember the ’70s, when at times it was extremely difficult to buy gasoline?  If you were driving then, you certainly remember waiting in line for an hour or more just to put gas in the ol’ car. Thanks to the OPEC oil embargo, petroleum was in short supply. At that time, America’s need for crude oil was soaring while U.S. production was falling. As a result, the U.S. was becoming increasingly dependent on foreign suppliers. Things have changed a great deal since then. Beginning in the mid-2000s, America’s dependence on foreign oil began to decline.  One of the reasons for this decline is the abundance of natural gas, or methane, in the U.S.

“At the rate of U.S. dry natural gas consumption in 2015 of about 27.3 Tcf (trillion cubic feet) per year, the United States has enough natural gas to last about 86 years. The actual number of years will depend on the amount of natural gas consumed each year, natural gas imports and exports, and additions to natural gas reserves.” (July 25, 2017)
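The arithmetic behind a quote like this is straightforward; here is a quick sketch in Python. The roughly 2,348 Tcf resource base is an assumption inferred from the quote itself, not a number given in it:

```python
# Years-of-supply estimate: resource base divided by annual consumption.
ANNUAL_CONSUMPTION_TCF = 27.3   # 2015 U.S. dry natural gas consumption (from the quote)
YEARS_OF_SUPPLY = 86            # figure cited in the quote

# The quote implies a recoverable resource base of roughly 27.3 * 86 ~= 2,348 Tcf.
implied_resource_tcf = ANNUAL_CONSUMPTION_TCF * YEARS_OF_SUPPLY

def years_remaining(resource_tcf: float, consumption_tcf_per_year: float) -> float:
    """Years of supply at a constant annual consumption rate."""
    return resource_tcf / consumption_tcf_per_year

print(round(implied_resource_tcf))  # roughly 2,348 Tcf
# If consumption rose to 30 Tcf/year, the same resource base would last fewer years:
print(round(years_remaining(implied_resource_tcf, 30.0), 1))  # roughly 78 years
```

As the quote itself notes, the real number of years depends on consumption, trade, and reserve additions; this sketch only shows why the cited figures are internally consistent.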

For most of the one hundred and fifty (150) years of U.S. oil and gas production, natural gas has played second fiddle to oil. That appeared to change in the mid-2000s, when natural gas became the star of the shale revolution, and eight of every 10 rigs were chasing gas targets.

But natural gas turned out to be a shooting star. Thanks to the industry’s incredible success in leveraging game-changing technology to commercialize ultralow-permeability reservoirs, the market was looking at a supply glut by 2010, with prices below producer break-even values in many dry gas shale plays.

Everyone knows what happened next. The shale revolution quickly transitioned to crude oil production, and eight of every ten (10) rigs suddenly were drilling liquids. What many in the industry did not realize initially, however, is that tight oil and natural gas liquids plays would yield substantial associated gas volumes. With ongoing, dramatic per-well productivity increases in shale plays, and associated dry gas flowing from liquids resource plays, the beat just keeps going with respect to growth in oil, NGL and natural gas supplies in the United States.

Today’s market conditions certainly are not what had once been envisioned for clean, affordable and reliable natural gas. But producers can rest assured that vision of a vibrant, growing and stable market will become a reality; it just will take more time to materialize. There is no doubt that significant demand growth is coming, driven by increased consumption in industrial plants and natural gas-fired power generation, as well as exports, including growing pipeline exports to Mexico and overseas shipments of liquefied natural gas.

Just over the horizon, the natural gas star is poised to again shine brightly. But in the interim, what happens to the supply/demand equation? This is a critically important question for natural gas producers, midstream companies and end-users alike.

Natural gas production in the lower-48 states has increased from less than fifty (50) billion cubic feet a day (Bcf/d) in 2005 to about 70 Bcf/d today. This is an increase of forty percent (40%) over nine years, or a compound annual growth rate of about four percent (4%). There is no indication that this rate of increase is slowing. In fact, with continuing improvements in drilling efficiency and effectiveness, natural gas production is forecast to reach almost ninety (90) Bcf/d by 2020, representing another twenty-nine percent (29%) increase over 2014 output.
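Those growth percentages can be verified with a couple of lines of Python; this is a sketch using only the Bcf/d figures cited above:

```python
def cagr(start: float, end: float, years: float) -> float:
    """Compound annual growth rate between two production levels."""
    return (end / start) ** (1.0 / years) - 1.0

# Lower-48 production figures from the text (Bcf/d)
total_growth = 70 / 50 - 1        # 0.40 -> the forty percent cited
annual_rate = cagr(50, 70, 9)     # about 0.038, i.e. roughly four percent per year
further_growth = 90 / 70 - 1      # about 0.29, the increase from 2014 to the 2020 forecast

print(f"{total_growth:.0%}, {annual_rate:.1%}, {further_growth:.0%}")  # 40%, 3.8%, 29%
```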

Most of this production growth is concentrated in a few extremely prolific producing regions. Four of these are in a fairway that runs from the Texas Gulf Coast to North Dakota through the middle section of the country, and encompasses the Eagle Ford, the Permian Basin, the Granite Wash, the SouthCentral Oklahoma Oil Play and other basins in Oklahoma, and the Williston Basin. The other major producing region is the Marcellus and Utica shales in the Northeast. Almost all the natural gas supply growth is coming from these regions.

We are at the point where this abundance can allow US companies to export LNG, or liquefied natural gas.   To move this cleaner-burning fuel across oceans, natural gas must be converted into liquefied natural gas (LNG), a process called liquefaction. LNG is natural gas that has been cooled to –260° F (–162° C), changing it from a gas into a liquid that is 1/600th of its original volume.  This would be the same requirement for Dayton.  The methane gas captured would need to be liquefied and stored.  This is accomplished by transporting it in a vessel similar to the one shown below:
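The LNG figures above are easy to double-check; this quick sketch converts the liquefaction temperature and the volume reduction, using nothing beyond the numbers already quoted:

```python
def fahrenheit_to_celsius(deg_f: float) -> float:
    """Standard Fahrenheit-to-Celsius conversion."""
    return (deg_f - 32.0) * 5.0 / 9.0

LNG_TEMP_F = -260.0
print(round(fahrenheit_to_celsius(LNG_TEMP_F)))  # about -162 C, matching the figure above

# A 1/600 volume factor means 600 cubic meters of gas ship as 1 cubic meter of liquid.
gas_volume_m3 = 600.0
liquid_volume_m3 = gas_volume_m3 / 600.0
print(liquid_volume_m3)  # 1.0
```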

As you might expect, a vessel such as this requires very specific designs relative to the containment area.  A cutaway is given below to indicate just how exacting that design must be to accomplish, without mishap, the transportation of LNG to other parts of the world.

Loading LNG from storage to the vessel is no easy matter either and requires another significant expenditure of capital.

For this reason, LNG facilities around the world are somewhat limited in number.  The map below indicates their locations.

A typical LNG station, both process and loading, may be seen below.  This one is in Darwin.

CONCLUSIONS:

With natural gas in great supply, worldwide demand for this precious commodity will continue to grow.  We already see automobiles using LNG instead of gasoline as a primary fuel.  Also, the cost of LNG is significantly less than gasoline, even with average US gasoline prices around $2.00 per gallon.  According to AAA, the national average for regular, unleaded gasoline has fallen for thirty-five (35) of the last thirty-six (36) days to $2.21 per gallon, the lowest mark for this time of year since 2004. Gas prices continue to drop in most parts of the country due to abundant fuel supplies and declining crude oil costs. Average prices are about fifty-five (55) cents less than a year ago, which is motivating millions of Americans to take advantage of cheap gas by taking long road trips this summer.

I think the bottom line is: natural gas is here to stay.


The island of Puerto Rico has a remarkably long road ahead in rebuilding after Maria and Irma.

After Puerto Rico was pummeled two weeks ago by Hurricane Maria, a Category 4 storm with 150 mph winds, the island has been left in shambles. Irma had already caused widespread outages, leaving one million Puerto Ricans without electricity. Sixty thousand (60,000) still had not regained power when Maria brought a total, island-wide blackout and severe shortages of food, water, and other supplies.

As of today, October 2, 2017, there is still no power on the island except for a handful of generators powering high-priority buildings such as select hospitals.   The island most likely will not return to full power for another six to nine months. This also means there are close to zero working cell phone towers and no reception anywhere on the island.  Communication is the lifeblood of any rebuilding and humanitarian effort, and without landlines and cell phones, that effort will be incredibly long and frustrating. The following digital picture indicates the great lack of communication.

Fuel for generators is running out (though authorities in Puerto Rico insist that it’s a distribution problem, not a shortage). Puerto Ricans are waiting in six-hour lines for fuel, while many stations have run completely dry.

In most of Puerto Rico there is no water – that means no showers, no flushable toilets, and no drinkable water that’s not out of a bottle. In some of the more remote parts of the island, rescue workers are just beginning to arrive.

To indicate just how dire the situation is: according to the US Department of Health and Human Services, a Superfund site is “any land in the United States that has been contaminated by hazardous waste and identified by the EPA as a candidate for cleanup because it poses a risk to human health and/or the environment.” These sites are put on the National Priorities List (NPL), a list of the most dire cases of environmental contamination in the US and its territories. These are places where a person can’t even walk on the ground and breathe the air without seriously endangering their health.  That is exactly where Puerto Rico is at this time.

Puerto Rico’s fallout from Maria and Irma will mean a long, long road to recovery. Even though the island is home to 3.5 million US citizens, help has definitely been delayed compared to the response on the mainland.    The island’s pre-existing poverty and environmentally dangerous Superfund sites will make rebuilding a tricky and toxic business, costing billions of dollars.

We can get a better idea of the devastation by looking at the digital satellite pictures below.

A much more dramatic depiction may be seen below.

CONCLUSIONS:

As recently as 2016, the island suffered a three-day, island-wide blackout as a result of a fire. A private energy consultant noted then that the Puerto Rico Electric Power Authority “appears to be running on fumes, and … desperately requires an infusion of capital — monetary, human and intellectual — to restore a functional utility.” Puerto Ricans in early 2016 were suffering power outages at rates four to five times higher than average U.S. customers, said the report from the Massachusetts-based Synapse Energy Economics.  What was a very sad situation even before Maria and Irma is now a complete disaster.  As I mentioned above, a very long road of recovery lies ahead for the island.

 

VOLVO ANNOUNCEMENT

July 7, 2017


Certain portions of this post were taken from Mr. Chris Wiltz writing for Design News Daily.

I don’t know if you are familiar with the VOLVO line of automobiles, but for years the brand has been known for safety and durability.  My wife drives a 2005 VOLVO S-40 with great satisfaction relative to reliability and cost of maintenance.  The S-40 has about 150,000 miles on the odometer and continues to run like a Singer Sewing Machine.   The “boxy, smoking diesel” VOLVO of years gone by has been replaced by a very sleek, aerodynamic configuration representing significant improvements in design and styling.  You can take a look at the next two digitals to see where they are, inside and out.

As you can see from the JPEG above, the styling is definitely twenty-first century with agreeable slip-stream considerations in mind.

The interior is state-of-the-art with all the bells and whistles necessary to attract the most discerning buyer.

Volvo announced this past Tuesday that starting in 2019 it will only make fully electric or hybrid cars.  “This announcement marks the end of the solely combustion engine-powered car,” Håkan Samuelsson, Volvo’s president and chief executive, said in a statement.  The move is a significant bet by the carmaker, indicating it feels the age of the internal-combustion engine is quickly coming to an end.  Right now, the Gothenburg, Sweden-based automaker is alone among the world’s major automakers in moving so aggressively into electric or hybrid cars. Volvo sold around half a million cars last year, significantly fewer than the world’s largest car companies such as Toyota, Volkswagen, and GM, but far more than the 76,000 sold by Tesla, the all-electric carmaker.

Every car it produces from 2019 forward will have an electric motor.   Håkan Samuelsson indicated there has been a clear increase in consumer demand as well as a “commitment towards reducing the carbon footprint thereby contributing to better air quality in our cities.”  The Swedish automaker will cease production of pure internal combustion engine (ICE) vehicles and will not plan any new developments into diesel engines.

The company will begin producing three levels of electric vehicles (mild, Twin Engine, and fully electric) and has committed to selling one million Twin Engine or all-electric cars by 2025.   Between 2019 and 2021 Volvo plans to launch five fully electric cars, three of which will be Volvo models and two of which will be high-performance electric vehicles from Polestar, Volvo’s performance car division. Samuelsson said all of these electric vehicles will be new models and not necessarily new stylings of existing Volvo models.

Technical details on the vehicles were sparse during a press conference held by Volvo, but the company did offer information about its three electric vehicle tiers. The mild electric vehicles, which Volvo views as a stepping stone away from ICEs, will feature a forty-eight (48) volt system with a battery and a combined unit functioning as starter, generator, and electric motor.   Twin Engine will be a plug-in hybrid system.

During the press conference, Henrik Green, Senior VP of R&D at Volvo, said the company will strive to provide a “very competitive range” with these new vehicles, which will be available in medium-range and long-range versions, the latter delivering at least 500 kilometers (about 311 miles) on a single charge. Green said Volvo has not yet settled on a battery supplier but is looking at all available suppliers for the best option.  “When it comes to batteries of course it’s a highly competitive and important component in all the future pure battery electric vehicles,” he said. Samuelsson added that this should also be taken as an invitation for more companies to invest in battery research and development. “We need new players and competition in battery manufacturing,” Samuelsson said.

This new announcement represents a dramatic shift in point of view for Volvo. Back in 2014 Samuelsson said the company didn’t believe in all-electric vehicles and said that hybrids were the way forward. Why the change of heart? Samuelsson told the press conference audience that Volvo was initially skeptical about the cost of batteries and the lack of infrastructure for recharging electric cars. “Things have moved faster, customer demand has increased, battery costs have come down and there is movement now in charging infrastructure,” he said.


VOLVO did not unveil any details on vehicle costs. However, earlier reports from the Geneva Motor Show in March quoted Lex Kerssemakers, CEO of Volvo Car USA, as saying that the company’s first all-electric vehicle would have a range of at least 250 miles and a price point between $35,000 and $40,000 when released in 2019.

I think this is a fascinating step on the part of VOLVO.  They are placing all of their money on environmental efforts to reduce emissions.  I think that is very commendable.  Hopefully their vision for the future improves their brand and does not harm their sales efforts.


I know I’m spoiled.  I like to know that when I get behind the wheel, put the key in the ignition, start my vehicle, pull out of the driveway, etc., I can get to my destination without mechanical issues.  I think we are all basically there.  Now, to do that, you have to maintain your “ride”.  I have a 1999 Toyota PreRunner with 308,000-plus miles. Every three thousand miles I have it serviced.  Too much, you say?  Well, I do have 308K and it’s still humming like a Singer Sewing Machine.

Mr. Charles Murray has been following the automotive industry for over thirty years.  Mr. Murray is also a senior editor for Design News Daily.  Much of the information below comes from his recent post on the TEN MOST UNRELIABLE VEHICLES.  Each year Consumer Reports receives over one-half million consumer surveys with reliability information on the vehicles readers drive.  The story is not always a good one.  Let’s take a look at what CU readers consider the most unreliable vehicles and why.

Please keep in mind this is a CU report based upon feedback from vehicle owners.  Please do not shoot the messenger.  As always, I welcome your comments and hope this helps your buying research.

THE NEXT FIVE (5) YEARS

February 15, 2017


As you well know, there are many projections relative to economies, the stock market, sports teams, entertainment, politics, technology, etc.   People the world over have given their projections for what might happen in 2017.  The world of computing technology is absolutely no different.  Certain information for this post is taken from the COMPUTER.org/computer website.  These guys are pretty good at projections and have been correct multiple times over the past two decades.  They take their information from the IEEE.

The IEEE Computer Society is the world’s leading membership organization dedicated to computer science and technology. Serving more than 60,000 members, the IEEE Computer Society is the trusted information, networking, and career-development source for a global community of technology leaders that includes researchers, educators, software engineers, IT professionals, employers, and students.  In addition to conferences and publishing, the IEEE Computer Society is a leader in professional education and training, and has forged development and provider partnerships with major institutions and corporations internationally. These rich, self-selected, and self-paced programs help companies improve the quality of their technical staff and attract top talent while reducing costs.

With these credentials, you might expect them to be on the cutting edge of computer technology and development and be ahead of the curve as far as computer technology projections.  Let’s take a look.  Some of this absolutely blows me away.

Human-Brain Interface

This effort first started within the medical profession and is continuing as research progresses.  It’s taken time but after more than a decade of engineering work, researchers at Brown University and a Utah company, Blackrock Microsystems, have commercialized a wireless device that can be attached to a person’s skull and transmit via radio thought commands collected from a brain implant. Blackrock says it will seek clearance for the system from the U.S. Food and Drug Administration, so that the mental remote control can be tested in volunteers, possibly as soon as this year.

The device was developed by a consortium, called BrainGate, which is based at Brown and was among the first to place implants in the brains of paralyzed people and show that electrical signals emitted by neurons inside the cortex could be recorded, then used to steer a wheelchair or direct a robotic arm (see “Implanting Hope”).

A major limit to these provocative experiments has been that patients can only use the prosthetic with the help of a crew of laboratory assistants. The brain signals are collected through a cable screwed into a port on their skull, then fed along wires to a bulky rack of signal processors. “Using this in the home setting is inconceivable or impractical when you are tethered to a bunch of electronics,” says Arto Nurmikko, the Brown professor of engineering who led the design and fabrication of the wireless system.

Capabilities Hardware Projection

Unless you have been living in a tree house for the last twenty years you know digital security is a huge problem.  IT professionals and companies writing code will definitely continue working on how to make our digital world more secure.  That is a given.

Exascale

We can forget Moore’s Law, which refers to an observation made by Intel co-founder Gordon Moore in 1965. He noticed that the number of transistors per square inch on integrated circuits had doubled every year since their invention.  Moore’s Law predicts that this trend will continue into the foreseeable future. Although the pace has slowed, the number of transistors per square inch has since doubled approximately every 18 months; this is used as the current definition of Moore’s Law.  We are well beyond that, with processing speed literally progressing at “warp six”.
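The 18-month doubling period mentioned above compounds quickly; here is a small sketch of the arithmetic:

```python
def density_multiple(years: float, doubling_period_years: float = 1.5) -> float:
    """Factor by which transistor density grows under a fixed doubling period."""
    return 2.0 ** (years / doubling_period_years)

# Six doublings in nine years at the 18-month pace: a 64x increase in density.
print(density_multiple(9))    # 64.0
# Under Moore's original one-year doubling, the same nine years would give 512x.
print(density_multiple(9, doubling_period_years=1.0))  # 512.0
```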

Non-Volatile Memory

If you are an old guy like me, you can remember when computer memory cost an arm and a leg.  Take a look at the chart below and you get an idea of how memory costs have decreased over the years.

[Chart: hard-drive cost per gigabyte over time]

As you can see, costs have dropped remarkably over the years.

Photonics


Power-Conservative Multicores


CONCLUSION:

If you combine the above predictions with 1.) Big Data, 2.) Internet of Things (IoT), 3.) Wearable Technology, 4.) Manufacturing 4.0, 5.) Biometrics, and other fast-moving technologies you have a world in which “only the adventurous thrive”.  If you do not like change, I recommend you enroll in a monastery.  You will not survive gracefully without technology on the rampage. Just a thought.


I want us to consider a “what-if” scenario.  You are thirty-two years old, out of school, and have finally landed a job you really enjoy, AND you are actually making money at that job. You have your expenses covered, with “traveling money” left over for a little fun.  You recently discovered the possibility that Social Security (SS), when you are ready to retire, will be greatly reduced if not completely eliminated. You MUST start saving for retirement and consider SS to be icing on the cake if it is available at all.  QUESTION: Where do you start?  As you investigate the stock markets, you find stocks seem to be the best possibility for future income.  Stocks, bonds, T-bills, etc. are all possibilities, but stocks are at the top of the list.

People pay plenty of money for consulting giants to help them figure out which technology trends are fads and which will stick. You could go that route, or get the same thing from the McKinsey Global Institute’s in-house think tank for the cost of a new book. No Ordinary Disruption: The Four Global Forces Breaking All the Trends was written by McKinsey directors Richard Dobbs, James Manyika, and Jonathan Woetzel, and offers insight into which developments will have the greatest impact on the business world in coming decades. If you choose stocks, you will definitely want to look at technology sectors AND consider companies contributing products to those sectors.  The following list from that book may help.  Let’s take a look.

Below, we’re recapping their list of the “Disruptive Dozen”—the technologies the group believes have the greatest potential to remake today’s business landscape.

Batteries


The book’s authors predict that the price of lithium-ion battery packs could fall by a third in the next 10 years, which will have a big impact on not only electric cars, but renewable energy storage. There will be major repercussions for the transportation, power generation, and the oil and gas industries as batteries grow cheaper and more efficient.  Battery technology will remain with us and will contribute to ever-increasing product offerings as time goes by.  Companies supplying this market sector will only increase in importance.

Genomics


As super computers make the enormously complicated process of genetic analysis much simpler, the authors foresee a world in which “genomic-based diagnoses and treatments will extend patients’ lives by between six months and two years in 2025.” Sequencing systems could eventually become so commonplace that doctors will have them on their desktops.  This is a rapidly growing field and one that has and will save lives.

Material Science


The ability to manipulate existing materials on a molecular level has already enabled advances in products like sunglasses, bike frames, and medical equipment. Scientists have greater control than ever over nanomaterials in a variety of substances, and their understanding is growing. Health concerns recently prompted Dunkin’ Donuts to remove nanomaterials from their food. But certain advanced nanomaterials show promise for improving health, and even treating cancer. Coming soon: materials that are self-healing, self-cleaning, and that remember their original shape even if they’re bent.

Self-Driving or Autonomous Automobiles


Autonomous cars are coming, and fast. By 2025, the “driverless revolution” could already be “well underway,” the authors write. All the more so if laws and regulations in the U.S. can adapt to keep up. Case in point: some BMW cars already park themselves. You will not catch me in a self-driving automobile unless the federal government and the automaker can assure me they are safe.  Continuous effort is being expended to do just that.  These driverless automobiles are coming, and we all may as well get used to it.

Alternate Energy Solutions


Wind and solar have never really been competitive with fossil fuels, but McKinsey predicts that status quo will change thanks to technology that enables wider use and better energy storage. In the last decade, the cost of solar energy has already fallen by a factor of 10, and the International Energy Agency predicts that the sun could surpass fossil fuels to become the world’s largest source of electricity by 2050.  I might include with wind and solar, methane recovery from landfills, biodiesel, compressed natural gas, and other environmentally friendly alternatives.

Robotic Systems


The robots are coming! “Sales of industrial robots grew by 170% in just two years between 2009 and 2011,” the authors write, adding that the industry’s annual revenues are expected to exceed $40 billion by 2020. As robots get cheaper, more dexterous, and safer to use, they’ll continue to grow as an appealing substitute for human labor in fields like manufacturing, maintenance, cleaning, and surgery.

3-D Printing


Much-hyped additive manufacturing has yet to replace traditional manufacturing technologies, but that could change as systems get cheaper and smarter. “In the future, 3D printing could redefine the sale and distribution of physical goods,” the authors say. Think buying an electric blueprint of a shoe, then going home and printing it out. The book notes that “the manufacturing process will ‘democratize’ as consumers and entrepreneurs start to print their own products.”

Mobile Devices


The explosion of mobile apps has dramatically changed our personal experiences (goodbye hookup bars, hello Tinder), as well as our professional lives. More than two thirds of people on earth have access to a mobile phone, and another two or three billion people are likely to gain access over the coming decade. The result: internet-related expenditures outpace even agriculture and energy, and will only continue to grow.

Artificial Intelligence


It’s not just manufacturing jobs that will be largely replaced by robots and 3D printers. Dobbs, Manyika, and Woetzel report that by 2025, computers could do the work of 140 million knowledge workers. If Watson can win at “Jeopardy!” there’s nothing stopping computers from excelling at other knowledge work, ranging from legal discovery to sports coverage.

 

The Internet of Things (IoT)


Right now, 99% of physical objects are unconnected to the “internet of things.” It won’t last. Going forward, more products and tools will be controlled via the internet, the McKinsey directors say, and all kinds of data will be generated as a result. Expect sensors to collect information on the health of machinery, the structural integrity of bridges, and even the temperatures in ovens.

Cloud Technology


The growth of cloud technology will change just how much small businesses and startups can accomplish. Small companies will get “IT capabilities and back-office services that were previously available only to larger firms—and cheaply, too,” the authors write. “Indeed, large companies in almost every field are vulnerable, as start-ups become better equipped, more competitive, and able to reach customers and users everywhere.”

Oil Production


The International Energy Agency predicts the U.S. will be the world’s largest producer of oil by 2020, thanks to advances in fracking and other technologies, which have improved to the point where removing oil from hard-to-reach spots finally makes economic sense. The McKinsey directors expect the increasing ease of fuel extraction to further shift global markets.  This was a real surprise to me, but our country has abundant oil supplies, and we are already fairly self-sufficient.

Big Data


There is an ever-increasing accumulation of data from all sources.  At no time in our global history has there been a greater thirst for information.  We count and measure everything nowadays, the recent election being one example of that very fact.  Those who can control and manage big data are definitely ahead of the game.

CONCLUSION:  It’s a brave new world and a world that accommodates educated individuals.  STAY IN SCHOOL.  Get ready for what’s coming.  The world as we know it will continue to change with greater opportunities as time advances.  Be there.  Also, I would recommend investing in those technology sectors that feed the changes.  I personally don’t think a young investor will go wrong.

INTELLIGENT FLEET SOLUTIONS

October 16, 2016


Ever been on an Interstate?  Ever travel those highways WITHOUT seeing one of the “big rigs”?  I don’t think so. I have a commute every day on Interstate 75 and even at 0530 hours the heavy-duty truck traffic is significant.  As I travel that route, I pass two rest stops dedicated solely for drivers needing to take a break.  They are always full; lights on, engines running. (More about that later.)

Let’s take a very quick look at transportation in the United States to get calibrated as to the scope and breadth of the transportation industry. (NOTE: The following information comes from TruckInfo.net.)

  • How big is the trucking industry?
    Trucking companies, warehouses and the private sector in the U.S. employ an estimated 8.9 million people in trucking-related jobs; nearly 3.5 million of them are truck drivers. Of that figure, UPS alone employs 60,000 workers, and 9% of drivers are owner-operators.  LTL (less-than-truckload) shippers account for around 13.6 percent of America’s trucking sector.
  • How many trucks operate in the U.S.?
    An estimated 15.5 million trucks operate in the U.S.  Of this figure, 2 million are tractor-trailers.
  • How many truckers are there?
    There are an estimated 3.5 million-plus truck drivers in the U.S.  Of those, one in nine is independent, the majority of whom are owner-operators. Canada has in excess of 250,000 truck drivers.
  • How many trucking companies are there in the U.S.?
    There are an estimated 1.2 million trucking companies in the U.S. Of that figure, 97% operate 20 or fewer trucks, while 90% operate 6 or fewer.
  • How many miles does the transportation industry log in a year?
    In 2006 the transportation industry logged 432.9 billion miles. Class 8 trucks accounted for 139.3 billion of those miles, up from 130.5 billion in 2005.
  • What is the volume of goods transported by the trucking industry?
    The United States economy depends on trucks to deliver nearly 70 percent of all freight transported annually in the U.S., accounting for $671 billion worth of manufactured and retail goods transported by truck in the U.S. alone. Add $295 billion in truck trade with Canada and $195.6 billion in truck trade with Mexico.
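The freight-value figures quoted above add up quickly. As a quick sanity check, here is the combined total of domestic freight plus truck trade with Canada and Mexico, using only the numbers from the list:

```python
# Freight value moved by truck, in billions of dollars (figures quoted above)
domestic = 671.0      # U.S. manufactured and retail goods
canada_trade = 295.0  # truck trade with Canada
mexico_trade = 195.6  # truck trade with Mexico

total = domestic + canada_trade + mexico_trade
print(f"${total:.1f}B")  # → $1161.6B
```

Well over a trillion dollars of goods a year rides on trucks, which is why the fuel and emissions standards below matter so much.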

As you can see, the transportation industry, moving products from point “A” to point “B” by truck, is HUGE—absolutely HUGE.    With this being the case, our country has established goals to improve gas mileage for passenger cars, light trucks and heavy-duty trucks.  These goals are dedicated not only to improving gas mileage but also to reducing emissions.  Let’s take a look.

Passenger Car and Light Truck Standards for 2017 and beyond

In 2012, NHTSA established final passenger car and light truck CAFE standards for model years 2017-2021, which the agency projects will require, on average, a combined fleet-wide fuel economy of 40.3-41.0 mpg in model year 2021. As part of the same rulemaking action, EPA issued GHG standards, harmonized with NHTSA’s fuel economy standards, that are projected to require 163 grams/mile of carbon dioxide (CO2) in model year 2025.  EPA will reexamine the GHG standards for model years 2022-2025, and NHTSA will set new CAFE standards for those model years in the next couple of years, based on the best available information at that time.
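The CO2 standard and the fuel-economy figure are two views of the same target: burning one gallon of gasoline emits roughly 8,887 grams of CO2 (EPA’s standard conversion factor), so a tailpipe CO2 standard maps directly to an equivalent mpg. A quick sketch:

```python
# EPA's standard conversion factor: grams of CO2 emitted per gallon of gasoline burned
CO2_GRAMS_PER_GALLON = 8887

def mpg_equivalent(co2_grams_per_mile):
    """Convert a tailpipe CO2 standard (g/mile) to its fuel-economy equivalent (mpg)."""
    return CO2_GRAMS_PER_GALLON / co2_grams_per_mile

# The 163 g/mile CO2 target works out to the oft-quoted ~54.5 mpg figure
print(round(mpg_equivalent(163), 1))  # → 54.5
```

That is exactly where the “54.5 mpg by 2025” number you see quoted for the CAFE program comes from.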

The Big Rigs

On June 19, the U.S. Environmental Protection Agency (EPA) and the Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) announced major increases in fuel-efficiency standards for heavy-duty trucks. Part of President Obama’s comprehensive Climate Action Plan, Phase 2 of the Heavy-Duty National Program tightens emission standards for heavy-duty vehicles, including big rigs, delivery vehicles, dump trucks and buses.  The updated efficiency rule for trucks joins a growing list of fuel-efficiency measures, including the President’s 2012 doubling of fuel-efficiency standards for cars and light-duty trucks (the CAFE standards), as well as expected aircraft rules following EPA’s finding that aircraft emissions endanger human health.

While the miles-per-gallon (mpg) rating of cars and light-duty trucks has increased over the last decade or so, the fuel efficiency of heavy-duty trucks has held at roughly 5 mpg for over four decades. By contrast, the average passenger vehicle reached 24 mpg in 2010.  Under CAFE, cars and light-duty trucks are set to reach 54.5 mpg by 2025.

According to EPA, heavy-duty trucks are the fastest growing emissions segment of the U.S. transportation sector; they are currently responsible for twenty percent (20%) of greenhouse gas (GHG) emissions, while comprising just four percent (4%) of on-road vehicles.  Heavy duty trucks power the consumer economy, carrying seventy percent (70%) of all U.S. freight – weighing in at 10 billion tons of everything from food to electronics, building materials, clothes and other consumer goods.

As you can see, the goals are not only reductions in fuel usage but improvements in emissions.  There are companies and programs dedicated to meeting these goals.  The reason for this post is to show that people and companies are working to provide answers, solving problems and adding value to our environment and even our way of life. One such company is Intelligent Fleet Solutions.

The big question is: how do we meet these goals?  The burden falls on the companies that manufacture the engines and design the cabs and trailers.  Alternate fuels, i.e. CNG (compressed natural gas), biofuels, hydrogen, etc., are one answer, but maybe not the entire answer.

One way these goals may be met is by reducing engine idling while trucks are at rest.  The following chart explains the dilemma and one target for reducing petroleum consumption.

[Chart: fuel usage at idle]

This chart shows the petroleum consumption of various vehicles at idle.  Notice that a diesel engine can consume up to 1.00 gallon of fuel per hour while idling.  The question: can we lessen this consumption?
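To put that 1.00 gallon-per-hour figure in perspective, here is a minimal sketch of what overnight idling costs a single truck per year. The idle rate comes from the chart; the hours per day, operating days and fuel price are assumptions chosen only for illustration:

```python
# Illustrative idle-fuel estimate for one heavy-duty truck.
# The 1.0 gal/hr idle rate comes from the chart; everything else is an assumption.
IDLE_GAL_PER_HOUR = 1.0
idle_hours_per_day = 8      # assumed overnight rest period, engine running
operating_days = 300        # assumed working days per year
diesel_price = 2.50         # assumed fuel price, $/gallon

annual_gallons = IDLE_GAL_PER_HOUR * idle_hours_per_day * operating_days
annual_cost = annual_gallons * diesel_price
print(annual_gallons, annual_cost)  # → 2400.0 6000.0
```

Under those assumptions a single truck burns 2,400 gallons a year just sitting still, which is why those rest stops full of running engines are such an attractive target.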

Companies are designing and manufacturing devices to contribute to this effort, helping to drive us toward meeting some really tough CAFE goals.  One such company is Intelligent Fleet Solutions. Let’s take a look.

INTELLIGENT FLEET SOLUTIONS

What if the vehicle you drive could automatically alter its performance by doing the following?

  • Governing maximum speed in Class 8 vehicles
  • Optimizing acceleration
  • Providing for a more efficient cruise

If you look carefully at the following brochure you will see a device that provides all three.  The DERIVE program is downloaded into your vehicle’s ECM (Electronic Control Module), allowing settings to be tuned from generic to vehicle-specific.  You are in control.  The program is contained in a hand-held pendant that plugs into the same receptacle used to reset your check-engine light.  Heavy-duty trucks may have a different port for this pendant, but the same process is used.  The great part: the software is quick-loading and low-cost, and a driver or owner sees payback in considerably less than one year.  My friend Amy Dobrikova is an approved reseller for DERIVE technologies. Please contact her for further information at 765-617-8614.

[Brochure: DERIVE]

[Brochure: DERIVE, continued]

CONCLUSIONS:  Intelligent Fleet Solutions performs a great service in helping to preserve non-renewable fossil fuels AND in lessening or eliminating harmful emissions in our environment.  “Solutions” recognizes that “all hands must be on deck” to solve emission problems and conserve remaining petroleum supplies.  This company embodies the fact that America is still THE country in which technology is applied to solve problems and ensure specific goals are met.  Intelligent Fleet Solutions is a great contributor to that effort.  Check them out at intelligent-fleet.com.


Our two oldest granddaughters attend Georgia State University in Atlanta, Georgia.  Great school, and they have majors that will equip them well after graduation.  (No gender studies, basket weaving or quilting classes with these two.)  We visit them frequently, always enjoying our time together but dreading the commute to Atlanta. Love ‘Hotlanta’ but absolutely HATE the congestion, and that congestion begins about twenty (20) miles outside the city.   When the Braves, Falcons, Hawks, or Gladiators (ice hockey) are in town, the congestion is doubled.  Interstate 75 is the main route to most of central Florida, so summer-time travel is wonderful also.   You get the picture.

This got me to thinking: what is the monetary cost of travel?  Please note, I said monetary, not the cost of stress on one’s system, physical and mental. Data published in April of this year by the American Transportation Research Institute (ATRI) puts the impact of being stuck in traffic into stark terms with a single data point: traffic congestion on the U.S. National Highway System added over $49.6 billion (yes, that’s with a “B”) in operational costs to the trucking industry in 2014. That’s just added shipping costs for trucks delivering goods to clients and customers; it does not include the domestic agony experienced by a family of four trying to get to grandmother’s house for Thanksgiving dinner. ATRI said congestion resulted in a calculated delay totaling more than 728 million hours of lost productivity, equal to 264,500 commercial truck drivers sitting idle for a working year.  More than a dozen states each experienced increased costs of over one billion dollars ($1B) due to congestion.  Traffic congestion tended to be most severe in urban areas, with eighty-eight percent (88%) of the congestion costs concentrated in only eighteen percent (18%) of the network mileage and ninety-five percent (95%) of the total congestion costs occurring in metropolitan areas.  The analysis also demonstrates the impact of congestion costs on a per-truck basis, with average increased costs of $26,625 for trucks that travel 150,000 miles annually.

At one time, traffic congestion was considered an indicator of growth, but above a certain threshold congestion starts to become a huge drag on possible growth. Specifically, congestion seems to slow job growth when it gets to be worse than about thirty-five (35) to thirty-seven (37) hours of delay per commuter per year (or about four-and-a-half minutes per one-way trip, relative to free-flowing traffic).  A similar threshold exists when the entire road network gets too saturated throughout the course of the day (for transportation wonks, that’s at about 11,000 ADT per lane).  Above that four-and-a-half-minute threshold, something else happens: the quality of life of people making those commutes starts to decline. If you have to spend a miserable hour or two five days a week just getting to work, you’re either going to require higher wages to compensate you, or you’re going to look for another job. And if congestion makes it harder to match the right workers to the best jobs, that’s economically inefficient, too.
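The ATRI numbers imply a couple of useful unit costs. A quick back-of-the-envelope pass over the figures quoted above (just the arithmetic, not ATRI’s methodology):

```python
# Back-of-the-envelope checks on the ATRI figures quoted above
total_cost = 49.6e9          # added operational cost to trucking, dollars (2014)
delay_hours = 728e6          # hours of lost productivity
per_truck_cost = 26_625      # added cost for a high-mileage truck, dollars/year
annual_miles = 150_000       # miles that truck travels per year

cost_per_delay_hour = total_cost / delay_hours
cost_per_mile = per_truck_cost / annual_miles
print(f"{cost_per_delay_hour:.2f} {cost_per_mile:.4f}")  # → 68.13 0.1775
```

In other words, congestion costs the industry roughly $68 per hour of delay, and adds close to eighteen cents to every mile a high-mileage truck runs.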

When categorizing the delays impacting business, we see the following:

  1. Freight Delivery – market size, vehicle/fleet size, both cross-country and local
  2. Business Scheduling – delivery time shifts, reconfiguration of backhaul operations, use of relief drivers. Using Atlanta as an example, repair and replacement facilities, at one time, could accommodate an average of ten (10) clients per day.  Now, that’s down to six (6) per day due to congestion.  That’s money lost.
  3. Business Operations – inventory management, retail stocking, cross-docking
  4. Intermodal Connection Arrangements – access to truck/rail/air/sea interchange terminals.  Transportation must be scheduled and delays for any reason cost firms for rescheduling.
  5. Worker Travel and Compensation – worker time/cost, schedule reliability, “on-the-clock” work travel
  6. Business Relocation Issues – smaller dispersed location strategies, moves outside of major markets, shifts to production elsewhere
  7. Localized Interactions with Other Activities – land use/development and costs passed on to employees.

Each of these seven classes of business delay affects a specific area of the supply chain.  These systematic differences are important because they vary by industry, affect the ability of the industries involved to mitigate congestion costs through work-around operational changes, and ultimately affect local economic competitiveness in different ways.

ENVIRONMENTAL CONCERNS:

Congestion also affects the environment. No one will be surprised to learn that areas with the largest number of cars on the road see higher levels of air pollution on average. Motor vehicles are one of the largest sources of pollution worldwide. You may be surprised to learn, however, that slower-moving traffic emits more pollution than cars moving at freeway speeds. Traffic jams are bad for our air.  It seems intuitive that your car burns more fuel the faster you go, but the truth is that your car burns the most fuel while accelerating up to speed. Maintaining a constant speed against wind resistance burns a more or less constant amount. It’s when you find yourself in a sea of orange traffic cones — stuck in what looks more like a parking lot than a highway — that your car really starts eating up gas. The constant acceleration and braking of stop-and-go traffic burns more fuel, and therefore pumps more pollutants into the air.

The relationship between driving speed and pollution isn’t perfectly linear, although one study suggests that emissions start to go up when average freeway speed dips below forty-five (45) miles per hour (mph). They also go up dramatically as the average speed rises above 65 mph. So the “golden zone” for fuel consumption and emissions may be somewhere between 45 and 65 mph. Stopping and starting in traffic jams burns fuel at a higher rate than smooth travel on the open highway. This increased fuel consumption costs commuters additional money and adds to the emissions released by their vehicles. These emissions create air pollution related to global warming.
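The “golden zone” idea can be sketched as a toy model. The numbers below are illustrative only, not fitted to any study’s data; they simply encode the shape described above, with per-mile emissions rising as average speed drops into stop-and-go territory below 45 mph and rising again above 65 mph:

```python
def relative_emissions(speed_mph):
    """Toy model of per-mile emissions vs. average speed (illustrative shape only).
    Returns 1.0 inside the 45-65 mph 'golden zone', a stop-and-go penalty below it,
    and a drag penalty above it. The slopes are made-up for illustration."""
    if speed_mph < 45:
        # stop-and-go penalty grows as average speed drops toward gridlock
        return 1.0 + (45 - speed_mph) / 45 * 2.0
    elif speed_mph <= 65:
        return 1.0
    else:
        # aerodynamic-drag penalty above the zone
        return 1.0 + (speed_mph - 65) / 65

for s in (15, 55, 80):
    print(s, round(relative_emissions(s), 2))
```

The exact curve in the real world depends on the vehicle and the study, but the qualitative point stands: both gridlock and very high speeds cost you at the tailpipe.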

This leads to a dilemma for urban planners trying to develop roadways that reduce congestion with an eye to reducing the pollution it causes. Laying out the traffic cones for massive freeway-expansion projects sends air quality plummeting, but the hope is that air quality will improve somewhat once the cones are gone and everyone is cruising along happily at regular freeway speeds. Ironically, since average freeway speeds for non-congested traffic hover around seventy (70) mph and above (with states like Texas looking to increase their speed limits), air quality is unlikely to improve, and may actually worsen, once those highway improvements are finished.

ROAD RAGE:

This is horrible, but we see news reports every day concerning drivers who just “lose it.”  Eight out of ten drivers surveyed in the AAA Foundation’s annual Traffic Safety Culture Index rank aggressive driving as a “serious” or “extremely serious” risk that jeopardizes their safety. Although “road rage” incidents provide some of the most shocking views of aggressive driving, many common behaviors, including racing, tailgating, failing to observe signs and regulations, and seeking confrontations with other drivers, all qualify as potentially aggressive. Speeding is one of the most prevalent aggressive behaviors; AAA Foundation studies show that speeding is a factor in one-third of all fatal crashes.

Despite strong public awareness and understanding of aggressive driving, many people are willing to excuse aggressive behaviors.  Half of all drivers in the Traffic Safety Culture Index admitted to exceeding both neighborhood and highway speed limits by more than fifteen percent (15%) in the past thirty (30) days.  More remarkable, a quarter of drivers say they consider speeding acceptable. And much of the road rage we see follows time just spent in bumper-to-bumper traffic.

CONCLUSIONS:

Traffic hurts—our economy, our environment, our relationships with family and coworkers, and our physical health.  As always, I welcome your comments.

NANOMATERIALS

May 13, 2016


In recent months there has been considerable information regarding nanomaterials and how those materials are providing significant breakthroughs in R&D.  Let’s first define a nanomaterial.

DEFINITION:

“Nanomaterials describe, in principle, materials of which a single unit is sized (in at least one dimension) between 1 and 1000 nanometres (1 nm = 10⁻⁹ meter) but is usually 1–100 nm (the usual definition of nanoscale).”

Obviously microscopic in nature but extremely effective when applied properly to a process.  Further descriptions are as follows:

A characterization of a nanomaterial must include the average particle size (allowing for aggregation, or clumping, of the individual particles) and a description of the particle number size distribution (the range from the smallest to the largest particle present in the preparation).

Detailed assessments may include the following:

  1. Physical properties:
  • Size, shape, specific surface area, and ratio of width and height
  • Whether they stick together
  • Number size distribution
  • How smooth or bumpy their surface is
  • Structure, including crystal structure and any crystal defects
  • How well they dissolve
  2. Chemical properties:
  • Molecular structure
  • Composition, including purity, and known impurities or additives
  • Whether it is held in a solid, liquid or gas
  • Surface chemistry
  • Attraction to water molecules or oils and fats

A number of techniques for tracking nanoparticles exist with an ever-increasing number under development. Realistic ways of preparing nanomaterials for test of their possible effects on biological systems are also being developed.

Some nanoparticles, such as volcanic ash and soot from forest fires, occur naturally; others are incidental byproducts of combustion processes (e.g., welding, diesel engines).  These are usually physically and chemically heterogeneous and are often termed ultrafine particles. Engineered nanoparticles, by contrast, are intentionally produced and designed with very specific shapes, sizes, surface properties and chemistries. These properties are reflected in aerosols, colloids or powders. Often the behavior of a nanomaterial depends more on its surface area than on its particle composition; relative surface area is one of the principal factors that enhance reactivity, strength and electrical properties.
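The surface-area point rewards a quick calculation. For idealized spherical particles, the surface-to-volume ratio is 6/d, so specific surface area scales inversely with diameter. A sketch in Python (the TiO2 density here is an assumed round value, used only for illustration):

```python
def specific_surface_area(diameter_nm, density_g_cm3):
    """Specific surface area (m^2/g) of monodisperse spherical particles.
    For a sphere, surface/volume = 6/d, so SSA = 6 / (density * diameter)."""
    diameter_m = diameter_nm * 1e-9
    density_kg_m3 = density_g_cm3 * 1000
    return 6 / (density_kg_m3 * diameter_m) / 1000  # m^2/kg -> m^2/g

# TiO2 (assumed density ~4.23 g/cm^3): shrinking particles from 1 um to 10 nm
for d_nm in (1000, 100, 10):
    print(d_nm, round(specific_surface_area(d_nm, 4.23), 1))
```

Shrinking the particle diameter a hundredfold multiplies the available surface per gram a hundredfold, which is exactly why nanoscale particles are so much more reactive than the same material in bulk.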

Engineered nanoparticles may be bought from commercial vendors or generated via experimental procedures by researchers in the laboratory (e.g., CNTs can be produced by laser ablation, HiPCO (high-pressure carbon monoxide), arc discharge, and chemical vapor deposition (CVD)). Examples of engineered nanomaterials include carbon buckyballs or fullerenes; carbon nanotubes; metal or metal-oxide nanoparticles (e.g., gold, titanium dioxide); and quantum dots, among many others.

[Photo: carbon nanotube]

The digital photograph above shows a nanotube, which is a member of the fullerene structural family. (NOTE:  A fullerene is a molecule of carbon in the form of a hollow sphere, ellipsoid, tube, or many other shapes. Spherical fullerenes are also called buckminsterfullerenes or buckyballs, and resemble soccer balls.  Cylindrical fullerenes are called carbon nanotubes or buckytubes.  Fullerenes are similar in structure to graphite, which is composed of stacked graphene sheets of linked hexagonal rings.) The name derives from their long, hollow structure, with walls formed by one-atom-thick sheets of carbon called graphene. These sheets are rolled at specific and discrete angles, and the combination of rolling angle and radius defines the nanotube’s properties; for example, whether an individual nanotube shell is metallic or semiconducting.  Nanotubes are categorized as single-walled nanotubes (SWNTs) or multi-walled nanotubes (MWNTs). Individual nanotubes naturally align themselves into “ropes” held together by van der Waals forces, more specifically pi-stacking.
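The rolling-angle point can be made concrete with the standard chiral-index rule for nanotubes: a tube described by indices (n, m) is (quasi-)metallic when n − m is divisible by 3 and semiconducting otherwise, with diameter d = a·√(n² + nm + m²)/π, where a ≈ 0.246 nm is the graphene lattice constant. A small sketch:

```python
import math

def nanotube_character(n, m, a_nm=0.246):
    """Classify a carbon nanotube by its chiral indices (n, m).
    A nanotube is (quasi-)metallic when (n - m) is divisible by 3,
    otherwise semiconducting. Also returns the diameter in nm:
    d = a * sqrt(n^2 + n*m + m^2) / pi, with a the graphene lattice constant."""
    kind = "metallic" if (n - m) % 3 == 0 else "semiconducting"
    diameter_nm = a_nm * math.sqrt(n**2 + n * m + m**2) / math.pi
    return kind, round(diameter_nm, 2)

print(nanotube_character(10, 10))  # armchair tube → ('metallic', 1.36)
print(nanotube_character(10, 0))   # zigzag tube → ('semiconducting', 0.78)
```

Two tubes of nearly the same diameter can thus be a conductor or a semiconductor purely because of how the graphene sheet is rolled.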

The JPEG below shows a nanoplate material.

[Image: nanoplate material]

Nanoplate combines nanometer-scale materials in engineered and industrial coating processes to incorporate new and improved features into the finished product.

USES OF NANOTECHNOLOGY:

Let’s look at today’s uses for nanotechnology and you can get a good picture of where the field is going.

  • Stain-repellent Eddie Bauer Nano-Care™ khakis, with surface fibers of 10 to 100 nanometers, use a process that coats each fiber of fabric with “nano-whiskers.” The treatment was developed by Nano-Tex, a Burlington Industries subsidiary. Dockers also makes khakis, a dress shirt and even a tie treated with what it calls “Stain Defender,” another example of the same nanoscale cloth treatment.
    Impact: Dry cleaners, detergent and stain-removal makers, carpet and furniture makers, window covering makers.
  • BASF’s annual sales of aqueous polymer dispersion products amount to around $1.65 billion. All of them contain polymer particles ranging from ten to several hundred nanometers in size. Polymer dispersions are found in exterior paints, coatings and adhesives, or are used in the finishing of paper, textiles and leather. Nanotechnology also has applications in the food sector. Many vitamins and their precursors, such as carotenoids, are insoluble in water. However, when skillfully produced and formulated as nanoparticles, these substances can easily be mixed with cold water, and their bioavailability in the human body also increases. Many lemonades and fruit juices contain these specially formulated additives, which often also provide an attractive color. In the cosmetics sector, BASF has for several years been among the leading suppliers of UV absorbers based on nanoparticulate zinc oxide. Incorporated in sun creams, the small particles filter the high-energy radiation out of sunlight. Because of their tiny size, they remain invisible to the naked eye and so the cream is transparent on the skin.
  • Sunscreens are utilizing nanoparticles that are extremely effective at absorbing light, especially in the ultraviolet (UV) range. Due to the particle size, they spread more easily, cover better, and save money since you use less. And they are transparent, unlike traditional sunscreens, which are white. These sunscreens are so successful that by 2001 they had captured 60% of the Australian sunscreen market.  Impact: Makers of sunscreen will have to convert to using nanoparticles. And other product manufacturers, like packaging makers, will find ways to incorporate them into packages to reduce UV exposure and subsequent spoilage. The $480B packaging and $300B plastics industries will be directly affected.
  • Using aluminum nanoparticles, Argonide has created rocket propellants that burn at double the rate. They also produce copper nanoparticles that are incorporated into automotive lubricant to reduce engine wear.
  • AngstroMedica has produced a nanoparticulate-based synthetic bone. “Human bone is made of a calcium and phosphate composite called Hydroxyapatite. By manipulating calcium and phosphate at the molecular level, we have created a patented material that is identical in structure and composition to natural bone. This novel synthetic bone can be used in areas where the natural bone is damaged or removed, such as in the treatment of fractures and soft tissue injuries.”
  • Nanodyne makes a tungsten-carbide-cobalt composite powder (grain size less than 15nm) that is used to make a sintered alloy as hard as diamond, which is in turn used to make cutting tools, drill bits, armor plate, and jet engine parts.
    Impact: Every industry that makes parts or components whose properties must include hardness and durability.
  • Wilson Double Core tennis balls have a nanocomposite coating that keeps them bouncing twice as long as an old-style ball. Made by InMat LLC, this nanocomposite is a mix of butyl rubber intermingled with nanoclay particles, giving the ball a substantially longer shelf life. Impact: Tires are the next logical extension of this technology; it would make them lighter (better mileage) and longer-lasting (better cost performance).
  • Applied Nanotech recently demonstrated a 14″ monochrome display based on electron emission from carbon nanotubes.  Impact: Once the process is perfected, costs will go down and the high-end market will start being filled. Shortly thereafter, hand-in-hand with the predictable drop in the price of CNTs, production economies of scale will enable costs to drop further still, at which time we will see nanotube-based screens in use everywhere CRTs and view screens are used today.
  • China’s largest coal company (Shenhua Group) has licensed technology from Hydrocarbon Technologies that will enable it to liquefy coal and turn it into gas. The process uses a gel-based nanoscale catalyst, which improves the efficiency and reduces the cost.  Impact: “If the technology lives up to its promise and can economically transform coal into diesel fuel and gasoline, coal-rich countries such as the U.S., China and Germany could depend far less on imported oil. At the same time, acid-rain pollution would be reduced because the liquefaction strips coal of harmful sulfur.”

CONCLUSION:

I’m sure the audience I attract will grasp the significance of nanotechnology and its existing uses in today’s commercial markets.  This is a growing technology, and one in which significant R&D effort is being applied.  I think the words are “STAND BY”: there is more to come in the immediate future.

 
