May 21, 2016
My wife and I went to a party this afternoon—an outdoor party given by a company devoted to fitness. They wanted to show their appreciation to the clients who allow them to beat us up several times each week. (We even pay them for doing this. Go figure.) Great party, and it made me realize what a marvelous country we live in. There is room on top of room if you happen to be in the right “neck of the woods”. We traveled only thirty-five (35) minutes to Jasper Highlands, Tennessee to enjoy the day and say hello to our friends. The location was on the top of Jasper Mountain. Take a look.
This is looking West from the top of the Highlands.
Looking South from the Highlands.
It got me to thinking: Just how big are we in this country?
Together, the forty-eight (48) contiguous states and Washington, D.C. occupy a combined area of 3,119,884.69 square miles (8,080,464.3 km2), which is 1.58% of the total surface area of Earth. Of this area, 2,959,064.44 square miles (7,663,941.7 km2) is land, composing 83.65% of U.S. land area, similar to the area of Australia. Officially, 160,820.25 square miles (416,522.5 km2) is water area, composing 62.66% of the nation’s total water area.
The contiguous United States would be placed 5th in the list of countries and dependencies by area; the total area of the country, including Alaska and Hawaii, ranks fourth. Brazil is the only country that is larger in total area than the contiguous United States, but smaller than the entire United States, while Russia, Canada and China are the only three countries larger than both. The 2010 census population of this area was 306,675,006, comprising 99.33% of the nation’s population, and a density of 103.639 inhabitants/sq mi (40.015/km2), compared to 87.264/sq mi (33.692/km2) for the nation as a whole.
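For the curious, the density figures above hold up under simple arithmetic. Here is a quick Python sketch (the variable names are mine; the conversion constant is the standard square-mile-to-square-kilometer factor):

```python
# Figures quoted above: 2010 census population and land area of the
# contiguous states plus D.C.
population = 306_675_006
land_sq_mi = 2_959_064.44
SQ_MI_TO_SQ_KM = 2.589988

density_mi = population / land_sq_mi
density_km = density_mi / SQ_MI_TO_SQ_KM
print(f"{density_mi:.3f} per sq mi, {density_km:.3f} per sq km")
# Matches the quoted 103.639 inhabitants/sq mi (40.015/km2).
```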
If we just look at Alaska, we see the following:
According to an October 1998 report by the United States Bureau of Land Management, approximately sixty-five percent (65%) of Alaska is owned and managed by the U.S. federal government as public lands, including a multitude of national forests, national parks, and national wildlife refuges. Of these, the Bureau of Land Management manages 87 million acres (35 million hectares), or 23.8% of the state. The Arctic National Wildlife Refuge is managed by the United States Fish and Wildlife Service. It is the world’s largest wildlife refuge, comprising 16 million acres (6.5 million hectares).
Of the remaining land area, the state of Alaska owns 101 million acres (41 million hectares), its entitlement under the Alaska Statehood Act. A portion of that acreage is occasionally ceded to organized boroughs, under the statutory provisions pertaining to newly formed boroughs. Smaller portions are set aside for rural subdivisions and other homesteading-related opportunities. These are not very popular due to the often remote and roadless locations. The University of Alaska, as a land grant university, also owns substantial acreage which it manages independently.
Another forty-four (44) million acres (18 million hectares) are owned by 12 regional, and scores of local, Native corporations created under the Alaska Native Claims Settlement Act (ANCSA) of 1971. Regional Native corporation Doyon, Limited often promotes itself as the largest private landowner in Alaska in advertisements and other communications. Provisions of ANCSA allowing the corporations’ land holdings to be sold on the open market starting in 1991 were repealed before they could take effect. Effectively, the corporations hold title (including subsurface title in many cases, a privilege denied to individual Alaskans) but cannot sell the land. Individual Native allotments can be and are sold on the open market, however.
Various private interests own the remaining land, totaling about one percent of the state. Alaska is, by a large margin, the state with the smallest percentage of private land ownership when Native corporation holdings are excluded.
To get an idea as to just how big Alaska is, take a look at the map below.
OK, now let’s look at our biggest state within the contiguous United States—Texas.
Texas is the second largest U.S. state, behind Alaska, with an area of 268,820 square miles (696,200 km2). Though ten percent (10%) larger than France and almost twice as large as Germany or Japan, it ranks only 27th worldwide among country subdivisions by size. If it were still an independent country, Texas would be the 40th largest behind Chile and Zambia.
Now if you really want to talk about the wide open spaces, take a look at the area around Telluride, Colorado. You would think there is room enough for the entire nation.
We are a vast country with something to satisfy every taste. You can travel to Manhattan, where the population density puts you right on top of everyone else, or to Alaska, where your nearest neighbor may be twenty miles away.
May 21, 2016
Once a month a group of guys and I get together for lunch. Great friends needing to solve the world’s problems. (Here lately, that’s taken much longer than the one and one-half hours we spend during our meeting.) One of our friends, call him Joe, just underwent surgery for prostate cancer. This is called a prostatectomy and is done every day. His description of the “event” was fascinating. To begin with, the surgeon was about twenty (20) feet from the operating table. Yes, that’s correct; the entire surgery was accomplished via robotic systems. OK, why is this procedure more desirable than the “standard” procedure? The robotic-assisted approach is less invasive, reduces bleeding and offers large 3-D views of the operating field. The mechanical arms of the robotic system are controlled by the surgeon and provide greater precision than the human hand. This allows the surgeon more control when separating nerves and muscles from the prostate. This benefits patients by lowering the risk of side effects, such as erectile dysfunction and incontinence, while also removing the cancerous tissue. The equipment looks very similar, if not identical, to the one shown in the JPEG below. Let’s take a look.
As you can see, the electromechanical devices are remarkably sophisticated and represent significant advances in medical technology. The equipment you are seeing above is called the “patient side cart”. It looks as follows:
During a robotic prostatectomy, the patient side cart is positioned next to the operating table. The system you see above is a da Vinci robotic arm arranged to provide entry points into the human body and the prostate. EndoWrist instruments, and the da Vinci InSite Vision System, are mounted onto the robot’s electromechanical arms, representing the surgeon’s left and right hands. They provide the functionality to perform complex tissue manipulation through the entry points, or ports. EndoWrist instruments include forceps, scissors, electrocautery, scalpels and other surgical tools. If the surgeon needs to change an EndoWrist instrument, common during robotic prostatectomy, the instrument is withdrawn from the surgical system using controls at the console. Typically, an operating room nurse standing near the patient physically removes the EndoWrist instruments and replaces them with new ones.
There are certainly other types of surgery performed today using robotic systems. Several of these are as follows:
- Cardiac Surgery
- Colorectal Surgery
- General Surgery
- Gynecologic Surgery
- Head & Neck Surgery
- Thoracic Surgery
- Urologic Surgery
One electromechanical device that helps to make this remarkable procedure possible is called an encoder. Let’s define an encoder.
An encoder is a sensor of mechanical motion that generates digital signals in response to motion. As an electro-mechanical device, an encoder is able to provide motion control system users with information concerning position, velocity and direction. There are two different types of encoders: linear and rotary. A linear encoder responds to motion along a path, while a rotary encoder responds to rotational motion. An encoder is generally categorized by the means of its output. An incremental encoder generates a train of pulses which can be used to determine position and speed. An absolute encoder generates unique bit configurations to track positions directly.
As you might expect, knowing the exact position of a medical device used during surgery is absolutely critical to the outcome. The surgeon MUST know the angular position of the device at all times to ensure no errors are made. Nerves, tendons and muscles must be left intact. This information is provided by encoders and encoder data systems.
Linear and rotary encoders are broken down into two main types: the absolute encoder and the incremental encoder. The construction of these two types is quite similar; however, they differ in physical properties and in how movement is interpreted.
Incremental rotary encoders utilize a transparent disk containing equally spaced opaque sections to determine movement. Light from a light-emitting diode passes through the disk and is detected by a photodetector, causing the encoder to generate a train of equally spaced pulses as it rotates. The output of an incremental rotary encoder is measured in pulses per revolution, which is used to keep track of position or determine speed. This type of encoder is required in the medical system described above.
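To make the “pulses per revolution” idea concrete, here is a small Python sketch (my own illustration, not vendor code) that turns a raw pulse count into a shaft angle and an average speed:

```python
def angle_deg(pulse_count, ppr):
    """Shaft angle implied by a running pulse count for a PPR-rated encoder."""
    return (pulse_count % ppr) * 360.0 / ppr

def speed_rpm(pulses, ppr, dt_s):
    """Average shaft speed from the pulses seen in a dt_s-second window."""
    revolutions = pulses / ppr
    return revolutions / dt_s * 60.0

# A 1000-pulse-per-revolution encoder that has counted 250 pulses has
# turned a quarter revolution: 90 degrees.
print(angle_deg(250, 1000))         # 90.0
# 1000 pulses observed in 0.5 s -> 2 rev/s -> 120 RPM.
print(speed_rpm(1000, 1000, 0.5))   # 120.0
```

Note that an incremental encoder only counts from wherever it started; it needs a reference (home) position at power-up, which is one reason absolute encoders exist.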
Absolute encoders utilize a stationary mask between the photodetector and the encoder disk, as shown below. The output signal generated by an absolute encoder consists of digital bits corresponding to a unique position. The bit configuration is produced by the light received by the photodetector as the disk rotates, and the resulting pattern is translated into Gray code. As a result, each position has its own unique bit configuration.
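The Gray code mentioned above has a simple recipe: each value is XORed with itself shifted right one bit, so adjacent positions differ in exactly one bit. A short Python sketch (my own illustration):

```python
def binary_to_gray(n):
    """Reflected binary (Gray) code: adjacent values differ in exactly one bit."""
    return n ^ (n >> 1)

def gray_to_binary(g):
    """Invert the encoding by folding the shifted bits back in."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# An 8-position (3-bit) absolute disk: positions 3 (010) and 4 (110)
# differ in only one bit, so a misread during a transition is off by
# at most one position.
for pos in range(8):
    g = binary_to_gray(pos)
    assert gray_to_binary(g) == pos   # round-trip check
    print(pos, format(g, "03b"))
```

The single-bit-change property is the whole point: with plain binary, moving from 3 (011) to 4 (100) flips three bits at once, and a momentary misread could report a wildly wrong position.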
Typical construction for a rotary encoder is given as follows:
Please note the following features:
- Electrical connection to the right of the encoder body.
- Encoder shaft that couples to the medical device.
- Electrical specifications indicating the device is driven by a five (5) volt +/- 5% source.
You can see from the above illustrated parts breakdown that a rotary encoder is quite technical in design.
System accuracy is critical, especially during surgery. Let’s look.
An encoder’s performance is typically stated as resolution rather than accuracy of measurement. The encoder may be able to resolve movement into precise bits, but the accuracy of each bit is limited by the quality of the machine motion being monitored. For example, if machine elements deflect under load, or if a drive screw has 0.1 inch of play, using a 1000 count-per-turn encoder with an output reading to 0.001 inch will not improve the 0.1 inch tolerance on the measurement. The encoder only reports position; it cannot improve on the basic accuracy of the shaft motion from which the position is sensed. In other words, a fine-resolution encoder is necessary but not sufficient: the mechanical linkages of a surgical robot must be built to match the encoder’s precision, or the extra resolution is wasted.
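The numbers in the paragraph above can be put into a few lines of Python. The 1-inch screw lead is my own assumption for illustration; the 1000 counts per turn and 0.1 inch of play come from the example:

```python
counts_per_turn = 1000
inches_per_turn = 1.0     # assumed lead of the drive screw (illustrative)
backlash_in = 0.1         # mechanical play quoted in the example

resolution_in = inches_per_turn / counts_per_turn
print(f"encoder resolution: {resolution_in:.3f} in")       # 0.001 in

# The achievable system accuracy is dominated by the larger error source:
system_accuracy_in = max(resolution_in, backlash_in)
print(f"system accuracy:    {system_accuracy_in:.3f} in")  # 0.100 in
```

A hundred-fold gap between what the encoder can resolve and what the mechanics can actually deliver.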
TECHNOLOGY DELIVERS. Our lives are much better served by advancing technology, and certainly by technology applied to the medical profession. This is the reason engineers and technologists endure the rigor necessary to develop talents that ultimately will be directed at solving problems and advancing the technology you have seen in the post above.
As always, I welcome your comments. firstname.lastname@example.org
May 13, 2016
I have never presented to you a “re-blog” but the one written by Meagan Parrish below is, in my opinion, extremely important. We all know the manufacturing sector has really taken a hit in the past few years due to the following issues and conditions:
- Off-shoring or moving manufacturing operations to LCCs (low cost countries). Mexico, China, South Korea and other countries in the Pacific Rim have had an impact on jobs here in the United States.
- Productivity gains in manufacturing. The ability of a manufacturer to economize and simply “do it better” requires fewer direct and indirect employees.
- Robotic systems and automation of the factory floor have created a reduced need for hands-on assembly and production. This trend will only continue as the IoT (Internet of Things) becomes more and more prominent.
- One obvious force reducing jobs in American manufacturing has been the growth in China’s economy and its exports of a large variety of cheap manufactured goods (which are a great boon to American and other consumers). Since China did not become a major player in world markets until after 1990, exports from China cannot explain the downward trend in manufacturing employment prior to that year, but Chinese exports were important in the declining trends in manufacturing during the past 20 years. More than three-fourths of all U.S. traded goods are manufactured products, so goods trade most directly affects manufacturing output. Thus, increases in net exports (the trade balance) increase the demand for manufactured products, and increases in net imports (the trade deficit) reduce the demand for manufactured goods. The U.S. has run a goods trade deficit in every year since 1974 (U.S. Census Bureau 2015).
- The recession cut jobs in all sectors of the American economy, but especially in factories and construction.
- Manufacturers need fewer unskilled workers to perform rote tasks, but more highly skilled workers to operate the machines that automated those tasks. Manufacturers have substituted brains for brawn.
- Trade negotiations have to some degree left the United States on a non-level playing field. Our negotiations simply have not produced results in our best interest.
Manufacturing employment as a fraction of total employment has been declining for the past half century in the United States and the great majority of other developed countries. A 1968 book about developments in the American economy by Victor Fuchs was already entitled The Service Economy. Although the absolute number of jobs in American manufacturing was rather constant at about 17 million from 1969 to 2002, manufacturing’s share of jobs continued to decline from about 28% in 1962 to only 9% in 2011.
Concern about manufacturing jobs has become magnified as a result of the sharp drop in the absolute number of jobs since 2002. Much of this decline occurred prior to the start of the Great Recession in 2008, but many more manufacturing jobs disappeared rapidly during the recession. Employment in manufacturing has already picked up some from its trough as the American economy experiences modest economic growth, and this employment will pick up more when growth accelerates.
As a result of the drop in manufacturing, many of our workers are on welfare as demonstrated by the following post written by Ms. Meagan Parrish. Let’s take a brief look at her resume. The post will follow.
MEAGAN PARRISH BIO:
Meagan Parrish kicked off her career at Advantage Business Media as Chem.Info’s intrepid editor in December 2014. Prior to this role, she spent 12 years working in the journalism biz, including a four-and-a-half year stint as the managing editor of BRAVA, a regional magazine based in Madison, Wis. Meagan graduated from UW-Madison with a degree in international relations and spent a year working toward a master’s in international public policy. She has a strong interest in all things global — including energy, economics, politics and history. As a news junkie, she thinks it’s an exciting time to be working in the world of chemical manufacturing.
Study: One-Third Of Manufacturing Workers Use Welfare Assistance
There was a time when factory jobs lifted millions of U.S. workers out of poverty. But according to new data, today’s wages aren’t even enough to support the lives of 1 in 3 manufacturing employees.
The study, conducted by the University of California, Berkeley, found that about one-third of manufacturing workers seek government assistance in the form of food stamps, healthcare subsidies, tax credits for the poor or other forms of welfare to offset low wages.
This amounts to about 2 million workers, and between 2009 and 2013, the cost for assisting these workers added up to $10.2 billion per year.
What’s more, the number of employees on assistance shoots up 50 percent when temporary workers are included. In fact, the use of temp workers, who can be paid less and offered limited benefits, is one of the main reasons why the overall wage picture looks bleak for manufacturing.
“In decades past, production workers employed in manufacturing earned wages significantly higher than the U.S. average, but by 2013 the typical manufacturing production worker made 7.7 percent below the median wage for all occupations,” said Ken Jacobs, chair of the UC Berkeley Center for Labor Research and Education, in the paper.
“The reality is the production jobs are increasingly coming to resemble fast-food or Wal-Mart jobs,” Jacobs said.
By comparison, the share of fast-food workers who rely on public assistance is about 52 percent.
Oregon was named as the state that has the highest number of factory workers using food stamps, while Mississippi and Illinois lead the country in states needing healthcare assistance. When all forms of government subsidies were factored in, the states with the most manufacturing workers needing help were Mississippi, Georgia, California and Texas.
The research found that the median wage for non-supervisory manufacturing jobs was $15.66 in 2013, while one-fourth of the workers were making $11.91 or less.
A CNBC report on the study detailed the struggles of a single mom working as an assembler at a Detroit Chassis plant in Ohio for $9.50 an hour. She often doesn’t get full 40-hour work weeks and said she has to rely on food stamps, Medicaid and other government programs.
“I absolutely hate being on public assistance,” she said. “You constantly have people judging you.”
The report comes as debate about the minimum wage heats up in the presidential race. Raising the federal minimum wage to $15 has been a chief platform issue for Democratic presidential hopeful, Bernie Sanders. Presumptive Republican candidate Donald Trump has also shown support for lifting wages to some degree.
The findings have also added a sour note to recent good news about jobs in the U.S. Recently, the White House was boasting about improvements in the economy and cited a government report showing that about 232,000 new positions were created during the past 12 months.
CONCLUSIONS: MY THOUGHTS
To me this statistic is shameful. We are talking about the “working poor”. Honest people who cannot provide for their families on the wages they earn or with the skill-sets they have. Please note, I’m not proposing a raise in the minimum wage. I honestly feel that must be left to individual states and companies within each state to make that judgment. I feel the following areas must be addressed by the next president:
- Revamp the corporate and individual tax code. What we have is an abomination!
- Review ALL trade agreements made over the past twenty (20) years. Let’s level the playing field if at all possible.
- Eliminate red tape producing huge barriers to individuals wishing to start companies. When it comes to North American or Western European manufacturing, there are certainly more regulatory barriers to entry.
- Review all regulations, yes environmental also, that block productive commerce.
- Overbearing regulations can give too much power to a few, potentially corrupt, ruling regimes and prevent innovative ideas from flourishing. They can also be an obstacle to foreign investment in a country, since such conditions and regulations increase costs. (The fact that some of these regulations are usually for the benefit of the people of that nation poses another problem.)
- We have a huge skills gap in this country. Skills needed to drive high-tech companies and processes MUST be improved. This is an immediate need.
- Beijing signaled with its currency devaluation that the domestic economic slowdown it has failed to reverse is no longer a problem confined within China’s borders. It is now the world’s problem, too. This problem must be addressed by the next administration.
- Companies need to review their labor policies and do so quickly and with fairness. I’m of the opinion that people are almost universally the best judges of their own welfare, and should generally see to their own welfare (including continuing skill improvement and education), but I’m not in any way opposed to market based loans and even some limited amount of public funding for re-education of indigent non-productive workers (although charity & private sources would be a first choice for me).
As always, I welcome your comments.
May 13, 2016
In recent months there has been considerable information regarding nanomaterials and how those materials are providing significant breakthroughs in R&D. Let’s first define a nanomaterial.
“Nanomaterials describe, in principle, materials of which a single unit is sized (in at least one dimension) between 1 and 1000 nanometres (1 nm = 10⁻⁹ meter) but is usually 1–100 nm (the usual definition of nanoscale).”
Obviously microscopic in nature but extremely effective when applied properly to a process. Further descriptions are as follows:
A description of a nanomaterial must include the average particle size, allowing for aggregation or clumping of the individual particles, and a description of the particle number size distribution (the range from the smallest to the largest particle present in the preparation).
Detailed assessments may include the following:
- Physical properties:
  - Size, shape, specific surface area, and ratio of width and height
  - Whether they stick together
  - Number size distribution
  - How smooth or bumpy their surface is
  - Structure, including crystal structure and any crystal defects
  - How well they dissolve
- Chemical properties:
  - Molecular structure
  - Composition, including purity, and known impurities or additives
  - Whether it is held in a solid, liquid or gas
  - Surface chemistry
  - Attraction to water molecules or oils and fats
A number of techniques for tracking nanoparticles exist with an ever-increasing number under development. Realistic ways of preparing nanomaterials for test of their possible effects on biological systems are also being developed.
There are nanoparticles that occur naturally, such as volcanic ash and soot from forest fires, and others that arise as incidental byproducts of combustion processes (e.g., welding, diesel engines). These are usually physically and chemically heterogeneous and often termed ultrafine particles. Engineered nanoparticles, by contrast, are intentionally produced and designed with very specific properties of shape, size, surface characteristics and chemistry. These properties are reflected in aerosols, colloids, or powders. Often, the behavior of nanomaterials depends more on surface area than on particle composition itself. Relative surface area is one of the principal factors that enhance reactivity, strength and electrical properties.
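The surface-area effect is easy to quantify for an idealized spherical particle: a sphere’s surface-to-volume ratio works out to 3/r, so every time the particle shrinks by a factor of ten, the relative surface area grows tenfold. A quick Python sketch (idealized geometry, my own illustration):

```python
import math

def surface_to_volume(radius_nm):
    """Surface-area-to-volume ratio of a sphere, in 1/nm (equals 3/r)."""
    area = 4.0 * math.pi * radius_nm ** 2
    volume = (4.0 / 3.0) * math.pi * radius_nm ** 3
    return area / volume

# From a 1-micron particle down to a 1-nm particle, the relative surface
# area grows a thousandfold.
for r in (1000.0, 100.0, 10.0, 1.0):   # radii in nanometers
    print(f"r = {r:6.0f} nm -> S/V = {surface_to_volume(r):.3f} per nm")
```

That thousandfold increase in exposed surface is why reactivity, strength and electrical behavior change so dramatically at the nanoscale.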
Engineered nanoparticles may be bought from commercial vendors or generated via experimental procedures by researchers in the laboratory (e.g., CNTs can be produced by laser ablation, HiPCO or high-pressure carbon monoxide, arc discharge, and chemical vapor deposition (CVD)). Examples of engineered nanomaterials include: carbon buckyballs or fullerenes; carbon nanotubes; metal or metal oxide nanoparticles (e.g., gold, titanium dioxide); quantum dots, among many others.
The digital photograph above shows a nanotube, which is a member of the fullerene structural family. (NOTE: A fullerene is a molecule of carbon in the form of a hollow sphere, ellipsoid, tube, and many other shapes. Spherical fullerenes are also called Buckminsterfullerenes or buckyballs, which resemble balls used in soccer. Cylindrical fullerenes are called carbon nanotubes or buckytubes. Fullerenes are similar in structure to graphite, which is composed of stacked graphene sheets of linked hexagonal rings.) Their name is derived from their long, hollow structure with walls formed by one-atom-thick sheets of carbon, called graphene. These sheets are rolled at specific and discrete angles where the combination of the rolling angle and radius defines the nanotube properties; for example, whether the individual nanotube shell is a metal or semiconductor. Nanotubes are categorized as single-walled nanotubes (SWNTs) or multi-walled nanotubes (MWNTs). Individual nanotubes naturally align themselves into “ropes” held together by van der Waals forces, more specifically, pi-stacking.
The JPEG below shows a nanoplate material.
Nanoplate uses nanometer materials and combines them in engineered and industrial coating processes to incorporate new and improved features in the finished product.
USES OF NANOTECHNOLOGY:
Let’s look at today’s uses for nanotechnology and you can get a good picture as to where the field is going.
- Stain-repellent Eddie Bauer Nano-Care™ khakis, with surface fibers of 10 to 100 nanometers, use a process that coats each fiber of fabric with “nano-whiskers.” Developed by Nano-Tex, a Burlington Industries subsidiary. Dockers also makes khakis, a dress shirt and even a tie treated with what they call “Stain Defender”, another example of the same nanoscale cloth treatment.
Impact: Dry cleaners, detergent and stain-removal makers, carpet and furniture makers, window covering makers.
- BASF’s annual sales of aqueous polymer dispersion products amount to around $1.65 billion. All of them contain polymer particles ranging from ten to several hundred nanometers in size. Polymer dispersions are found in exterior paints, coatings and adhesives, or are used in the finishing of paper, textiles and leather. Nanotechnology also has applications in the food sector. Many vitamins and their precursors, such as carotenoids, are insoluble in water. However, when skillfully produced and formulated as nanoparticles, these substances can easily be mixed with cold water, and their bioavailability in the human body also increases. Many lemonades and fruit juices contain these specially formulated additives, which often also provide an attractive color. In the cosmetics sector, BASF has for several years been among the leading suppliers of UV absorbers based on nanoparticulate zinc oxide. Incorporated in sun creams, the small particles filter the high-energy radiation out of sunlight. Because of their tiny size, they remain invisible to the naked eye and so the cream is transparent on the skin.
- Sunscreens are utilizing nanoparticles that are extremely effective at absorbing light, especially in the ultraviolet (UV) range. Due to the particle size, they spread more easily, cover better, and save money since you use less. And they are transparent, unlike traditional sunscreens, which are white. These sunscreens were so successful that by 2001 they had captured 60% of the Australian sunscreen market. Impact: Makers of sunscreen have to convert to using nanoparticles. And other product manufacturers, like packaging makers, will find ways to incorporate them into packages to reduce UV exposure and subsequent spoilage. The $480B packaging and $300B plastics industries will be directly affected.
- Using aluminum nanoparticles, Argonide has created rocket propellants that burn at double the rate. They also produce copper nanoparticles that are incorporated into automotive lubricant to reduce engine wear.
- AngstroMedica has produced a nanoparticulate-based synthetic bone. “Human bone is made of a calcium and phosphate composite called hydroxyapatite. By manipulating calcium and phosphate at the molecular level, we have created a patented material that is identical in structure and composition to natural bone. This novel synthetic bone can be used in areas where the natural bone is damaged or removed, such as in the treatment of fractures and soft tissue injuries.”
- Nanodyne makes a tungsten-carbide-cobalt composite powder (grain size less than 15nm) that is used to make a sintered alloy as hard as diamond, which is in turn used to make cutting tools, drill bits, armor plate, and jet engine parts.
Impact: Every industry that makes parts or components whose properties must include hardness and durability.
- Wilson Double Core tennis balls have a nanocomposite coating that keeps them bouncing twice as long as an old-style ball. Made by InMat LLC, this nanocomposite is a mix of butyl rubber intermingled with nanoclay particles, giving the ball a substantially longer shelf life. Impact: Tires are the next logical extension of this technology: it would make them lighter (better mileage) and longer lasting (better cost performance).
- Applied Nanotech recently demonstrated a 14″ monochrome display based on electron emission from carbon nanotubes. Impact: Once the process is perfected, costs will go down, and the high-end market will start being filled. Shortly thereafter, and hand-in-hand with the predictable drop in price of CNTs, production economies-of-scale will enable the costs to drop further still, at which time we will see nanotube-based screens in use everywhere CRTs and view screens are used today.
- China’s largest coal company (Shenhua Group) has licensed technology from Hydrocarbon Technologies that will enable it to liquefy coal and turn it into gas. The process uses a gel-based nanoscale catalyst, which improves the efficiency and reduces the cost. Impact: “If the technology lives up to its promise and can economically transform coal into diesel fuel and gasoline, coal-rich countries such as the U.S., China and Germany could depend far less on imported oil. At the same time, acid-rain pollution would be reduced because the liquefaction strips coal of harmful sulfur.”
I’m sure the audience I attract will get the significance of nanotechnology and its existing uses in today’s commercial markets. This is a growing technology and one in which significant R&D effort is being applied. I think the words are “STAND BY”: there is more to come in the immediate future.
May 11, 2016
I think the most enduring and beneficial technology is evolutionary and not necessarily revolutionary.
The concept of “additive” manufacturing, specifically Selective Laser Sintering (SLS), began in a humble fashion. Carl Deckard, then a graduate student, and Joe Beaman, a professor at the University of Texas at Austin, began work in 1989 while Deckard was completing his Master’s degree and later his PhD. Today, “additive” manufacturing is a multi-million dollar business with immense possibilities.
Henry Ford’s Model T came long before the sleek Lamborghini.
Wilbur and Orville struggled for years to design, produce and fly their bi-wing marvel. The evolutionary result is the Lockheed Martin F-35, the Lockheed Martin F-22 Raptor, the Boeing F/A-18 Super Hornet, the Boeing 777, the Airbus A380 and the Boeing 787 Dreamliner.
A newly employed engineer for Texas Instruments (TI) named Jack Kilby recorded his initial idea for the integrated circuit in July of 1958. The concept was successfully demonstrated on 12 September 1958. Kilby won the Nobel Prize in Physics in 2000. The rest is history.
Tetris, Wii, Minecraft and Super Mario Brothers had their start in October 1958, when a physicist named William Higinbotham created what is thought to be the first video game. It was a very simple tennis game, similar to the classic 1972 game Pong.
You get the picture—you know where I’m going. Technology is, for the most part, a process that evolves as need arises. I want to take a look at a fascinating, new technology now being called “Hyperloop”.
The Hyperloop is a conceptual high-speed transportation system originally put forward by entrepreneur Elon Musk. The concept incorporates reduced-pressure tubes in which pressurized capsules ride on an air cushion driven by linear induction motors and air compressors. If you look at the digital photograph below you will see the proposed speed is around 760 miles per hour. (Faster than a ’57 Chevy!) Please note also the comparison in miles per hour with other transportation systems. The only faster passenger mode of transportation is the now-retired Concorde.
The Hyperloop is a very high-speed, inter-city transportation system conveying passengers and cargo, with a projected capacity of fifteen (15) million passengers per year. Mr. Musk envisioned the system as an alternative to the California High-Speed Rail project, thus taking direct aim at the California plan for a sixty-nine (69) billion dollar high-speed train. Musk said the Hyperloop system would cost merely six (6) billion dollars and move people between San Francisco and Los Angeles in about half an hour rather than three hours.
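A quick back-of-the-envelope check of that claim needs nothing more than distance over time. (The roughly 380-mile corridor length below is my own assumed figure, not one from the proposal.)

```python
# Sanity check on the claimed San Francisco-to-Los Angeles trip time.
# The ~380-mile corridor length is an assumed figure, not from the proposal.
distance_miles = 380
trip_hours = 0.5          # "about half an hour"

average_speed = distance_miles / trip_hours
print(f"Implied average speed: {average_speed:.0f} mph")
```

An average of roughly 760 mph over the whole corridor matches the proposed top speed quoted above, which suggests the half-hour figure assumes near-top-speed cruising for essentially the entire route.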
A picture of the passenger pod is given as follows:
THE PASSENGER POD:
The climate controlled capsule travels inside of a reinforced ‘tube’ pathway, rendering the Hyperloop Transportation System weather independent and earthquake safe thanks to the use of pylons.
The futuristic transit system would consist of low-pressure steel tubes with aluminum capsules or pods supported on a cushion of air. The tubes, which would be outfitted with solar panels for power, would be built on elevated tracks alongside Interstate 5 in California. The entire structure would be elevated as much as one hundred feet above intended routes.
The concept is further demonstrated with the digitals that follow.
From late 2012 until August 2013, an informal group of engineers at both Tesla and SpaceX worked on the conceptual foundation and modeling of Hyperloop, allocating full-time effort toward the end. An early design for the system was then published in a white paper posted to the Tesla and SpaceX blogs. The permanent team is shown in the JPEG below. As you can see, the team is now in place and working to test the theories and operating principles.
In December 2015, the company announced plans to begin testing on an open-air track in Nevada beginning in January 2016, with hopes of reaching speeds of 700 mph (1,100 km/h) by the end of the year. Hyperloop Technologies (HT) procured fifty (50) acres of land and fabricated tube sections in order to build a test track in the Nevada desert. The test track is approximately 0.62 mi (1 km) long. The initial testing explores the ability of the company’s linear electric motor to accelerate the test vehicle to 335 mph (539 km/h). Thereafter the company plans to construct a full-scale 1.9 mi (3 km) test track where levitated pods will pass through low-friction tubes. The first test occurred on 10 May 2016 (in other words, today) and was very successful.
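For a sense of what reaching 335 mph within a roughly 1 km track demands, here is a sketch using the constant-acceleration kinematic relation v² = 2as. (The uniform-acceleration assumption is mine; the real motor profile will differ.)

```python
MPH_TO_MS = 0.44704   # miles per hour to metres per second
G = 9.81              # standard gravity, m/s^2

v = 335 * MPH_TO_MS   # target speed, roughly 150 m/s
s = 1000.0            # track length in metres (~0.62 mi)

# From v^2 = 2*a*s, solve for the constant acceleration a
a = v ** 2 / (2 * s)
print(f"Required acceleration: {a:.1f} m/s^2, about {a / G:.1f} g")
```

Roughly 11 m/s², a bit over 1 g, sustained for the full length of the track: supercar-launch territory, which gives a feel for how aggressive even this "short" test is.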
First, let’s talk about air. If you travel quickly, air piles up in front of you. The faster you go, the more the air piles up in front and the more resistance develops. This means you have to push even harder. And it’s not what we physicists call a “linear effect”. The faster you go, the worse it is. Bumping up your speed from 10 MPH to 20 MPH doesn’t take nearly as much effort as bumping it up from 110 MPH to 120 MPH. It’s why railway cars like the ones on the Shinkansen in Japan are so streamlined: to help the air flow over them and reduce how much piles up in front.
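The quadratic pile-up can be made concrete with the standard drag equation, F = ½ρC_dAv². The drag coefficient and frontal area below are illustrative round numbers, not data for any actual vehicle:

```python
MPH_TO_MS = 0.44704  # miles per hour to metres per second

def drag_force(v_ms, rho=1.225, cd=0.3, area=10.0):
    """Aerodynamic drag in newtons: F = 0.5 * rho * Cd * A * v^2.

    rho is sea-level air density (kg/m^3); cd and area are
    illustrative placeholder values, not real vehicle data.
    """
    return 0.5 * rho * cd * area * v_ms ** 2

# The extra drag from each 10 mph bump grows with speed:
low = drag_force(20 * MPH_TO_MS) - drag_force(10 * MPH_TO_MS)
high = drag_force(120 * MPH_TO_MS) - drag_force(110 * MPH_TO_MS)
print(f"10->20 mph adds {low:.0f} N of drag; 110->120 mph adds {high:.0f} N")
```

The same 10 mph increment costs nearly eight times as much additional drag at highway speeds, and since power is drag times velocity, the power penalty is steeper still. Note that drag also scales with air density, which is exactly the term the Hyperloop’s low-pressure tube attacks.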
The second problem you get with high-speed transport is friction between you and the road, where “road” can be an actual road or rails or cushiony magnetic field. Steel wheels on rails produce a lot of friction and heating. Maglev trains get around that by having the trains float on a magnetic field. There are magnets in the track and magnets in the train that repel each other.
The biggest issues are speed and scale. The Hyperloop was pitched as faster and cheaper than alternatives like cars and trains, but even small shifts in those numbers can dramatically change how it stacks up. It’s easy to imagine safety concerns limiting Hyperloop speeds to just a fraction of the theoretical top speed, or right-of-way issues keeping stations far from urban centers. Would we still be excited about the Hyperloop if a 30-minute trek became a three-hour one? What if it cost $60 billion instead of the promised $6 billion? After enough setbacks, it might not be worth developing the technology at all. Those deployment details are life-or-death issues for the Hyperloop, but as long as the tests are focused on small-scale loops, it’s not clear we’ll ever get answers to them.
Some feel the biggest hurdle isn’t the tech behind Hyperloop; it’s the land rights and every other bureaucratic obstacle that goes along with building enormous infrastructure projects. I personally feel this may be the biggest problem: the red tape associated with the project. The actual placement of the tubes and the route itself could be in the courts for years, maybe decades. I’m sure environmental impact studies would be needed in selecting the route, and this could tie the project up in state and Federal government review. The Fed is basically non-functioning at this time, so delays must be expected. This is the country we live in.
HT is very aggressive and has proposed the routes given below. As you can see, they intend to criss-cross the country with high-speed service.
This IS a project to watch, and with today being the first test, there is cause to be optimistic. Let’s wish Mr. Musk and his team the very best of luck.
May 1, 2016
As you probably know, I don’t “DO” politics. I stay with STEM (Science, Technology, Engineering and Mathematics); in other words, subjects I actually know something about. That being the case, I do feel the technical community must have definite opinions relative to pronouncements made by our politicians. Please keep in mind: most politicians hold degrees in fields other than the technical, so they are dependent upon input from individuals in the STEM professions. That’s really what this post is about: opinions relative to Senator Sanders’ Energy Plan. (NOTE: My facts are derived from Senator Sanders’ web site and Design News Daily Magazine. Mr. Charles Murray wrote an article in March detailing several points of Sanders’ plan.)
Sanders’ ideas seemingly represent a growing viewpoint among the American population at large. He fared fairly well in the Iowa caucuses and won the New Hampshire primary election, although history indicates he will not be the Democratic candidate facing the GOP representative unless Secretary Clinton is indicted by the FBI. I personally feel this has a snowball’s chance of happening. Sanders’ popularity provides an opportunity for engineers to weigh in on some of the hard issues facing the country in the energy arena. We want to know: How do seasoned engineers react to some of his ideas? Let’s look first at a brief statement from “Bernie” relative to his ideas on energy.
“Right now, we have an energy policy that is rigged to boost the profits of big oil companies like Exxon, BP, and Shell at the expense of average Americans. CEO’s are raking in record profits while climate change ravages our planet and our people — all because the wealthiest industry in the history of our planet has bribed politicians into complacency in the face of climate change. Enough is enough. It’s time for a political revolution that takes on the fossil fuel billionaires, accelerates our transition to clean energy, and finally puts people before the profits of polluters.”
— Senator Bernie Sanders
Bernie’s comprehensive plan to combat climate change and ensure our planet is habitable and safe for our kids and grandkids will:
- Cut U.S. carbon pollution by forty percent (40%) by 2030 and by over eighty percent (80%) by 2050 by (1) putting a tax on carbon pollution, (2) repealing fossil fuel subsidies and (3) making massive investments in energy efficiency and clean, sustainable energy such as wind and solar power.
- Create a Clean-Energy Workforce of ten (10) million good-paying jobs by creating a one hundred percent (100%) clean energy system. Transitioning toward a completely nuclear-free clean energy system for electricity, heating, and transportation is not only possible and affordable, it will create millions of good jobs, clean up our air and water, and decrease our dependence on foreign oil.
- Return billions of dollars to consumers impacted by the transformation of our energy system and protect the most vulnerable communities in the country suffering the ravages of climate change. Bernie will tax polluters causing the climate crisis, and return billions of dollars to working families to ensure the fossil fuel companies don’t subject us to unfair rate hikes. Bernie knows that climate change will not affect everyone equally: disenfranchised minority communities and the working poor will be hardest hit.
- Acceleration Away from Fossil Fuels. Sanders proposes a carbon tax that he believes would reduce carbon pollution 40% by 2030 and 80% by 2050. He also wants to ban Arctic oil drilling, ban offshore drilling, stop pipeline projects like the Keystone XL, stop exports of liquefied natural gas and crude oil, ban fracking for natural gas, and ban mountaintop removal coal mining. He would also ban fossil fuel lobbyists from working in the White House. Massive lobbying and unlimited super PAC donations by the fossil fuel industry give these profitable companies disproportionate influence on our elected leaders. This practice is business as usual in Washington and it is not acceptable. Heavy-handed lobbying fosters climate change skepticism. It has no place in the executive office.
- Investment in Clean Sustainable Energy. Sanders proposes investments in development of solar, wind, and geothermal energy plants, as well as cellulosic ethanol, algae-based fuels, and energy storage. As part of his move to cleaner energy sources, he is also calling for a moratorium on nuclear power plant license renewals in the US.
- Revolutionizing of Electric Transportation Infrastructure. To begin ridding the country of tailpipe emissions, Sanders wants to build electric vehicle charging stations, as well as high-speed passenger rail and cargo systems. Funds, he says, would also be needed to update and modernize the existing energy grid. Finally, he is calling for extension of automotive fuel economy standards to 65 mpg, instead of the planned 54.5 mpg, by 2025.
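The jump from 54.5 mpg to 65 mpg sounds incremental, but a quick per-vehicle fuel calculation shows what it buys. (The 12,000-mile annual mileage is a typical figure I am assuming, not part of the proposal.)

```python
annual_miles = 12_000            # assumed typical annual mileage
gallons_at_54_5 = annual_miles / 54.5
gallons_at_65 = annual_miles / 65

saved = gallons_at_54_5 - gallons_at_65
print(f"{gallons_at_54_5:.0f} vs {gallons_at_65:.0f} gallons per year: "
      f"about {saved:.0f} gallons saved per vehicle")
```

Roughly 35 gallons per vehicle per year is modest individually, but multiplied across a national fleet of well over two hundred million vehicles it adds up to billions of gallons.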
- Reclaiming of Our Democracy from the Fossil Fuel Lobby. Sanders wants to ban fossil fuel lobbyists from the White House. More importantly, he is proposing a “climate justice plan” that would bring deniers to justice “so we can aggressively tackle climate change.” He has already called for an investigation of Exxon Mobil, his website says.
COMMENTS FROM ENGINEERS:
- As engineers we should recognize the value of confronting real problems rather than dwelling on demagoguery. Go Bernie. This comment is somewhat generic, but I include it because there is an incredible quantity of demagoguery in today’s political narrative. Most of what we hear is without specifics.
- “Without fuel, we have no material or energy to manufacture anything. Plastics, fertilizer (food), metals, medicine: all rely on fuel … We are not going to reduce our need for fuel by eighty percent (80%) without massive technology breakthroughs.” I might add, those breakthroughs are decades away from being cost effective.
- “I like the idea of renewable energy and I think there are many places in which we are on the right track. A big question is how fast it takes to get there. The faster the transition, the more pain will occur … The slower the transition, the more comfortably we’ll all be able to adapt.”
- “Imagine if we had rolling power outages throughout the United States on a daily basis because of the shutdown of coal or nuclear power plants.”
- Another engineer wrote that “the actual numbers of death and cancer risks associated with all the nuclear disasters from Three Mile Island to (Chernobyl) and the Fukushima plant pale in comparison to the result of death and misery of coal and fossil fuel power plants supplying most of our electricity today and for the foreseeable future.”
- Another commenter said that “for Sanders to rid the US of fossil fuels, he must be one hundred percent (100%) in favor of nuclear energy. No amount of wind, solar, or geothermal will ever replace an ever-growing energy need.”
- Little or no attention in the forum was paid to the issue of intermittency, in particular whether a grid that’s heavy in renewables would be plagued by intermittency problems and, if so, how that might be solved. Intermittency problems that leave customers with no electrical power will NOT be tolerated by the US population. I think that’s a given. We are dependent upon electrical energy. This certainly includes needed security.
As a parting shot we read: “I am suggesting that folks carefully examine the record of those yelling the loudest, and then decide what to believe,” noted reader William K. “As engineering professionals, we should always be examining the history as well as the current.”
I would offer a sanity check: WE WILL NEVER COMPLETELY REMOVE OURSELVES FROM THE PRODUCTS PROVIDED BY FOSSIL FUELS. We must get over it. As always, I welcome your comments.
April 18, 2016
OK, I know you are aware of the acronym—STEM, but let’s refresh.
Now that that’s over with: the development of the microchip and integrated circuitry gave rise to our digital age. It seems the integrated circuit was destined to be invented. Two separate inventors, unaware of each other’s activities, invented almost identical integrated circuits, or ICs, at nearly the same time.
Jack Kilby, an engineer with a background in ceramic-based silk screen circuit boards and transistor-based hearing aids, started working for Texas Instruments in 1958. Mr. Kilby holds patents on over sixty inventions and is well known as the inventor of the portable calculator (1967). In 1970 he was awarded the National Medal of Science. A year earlier, research engineer Robert Noyce co-founded the Fairchild Semiconductor Corporation. Mr. Noyce, with sixteen patents to his name, also founded Intel, the company responsible for the invention of the microprocessor, in 1968. From 1958 to 1959, both electrical engineers were working on an answer to the same dilemma: how to make more from less.
In 1961 the first commercially available integrated circuits came from the Fairchild Semiconductor Corporation. Computers were soon being made with chips instead of individual transistors and their accompanying parts. Texas Instruments first used the chips in Air Force computers and the Minuteman Missile in 1962, and later used chips to produce the first electronic portable calculators. The original IC had only one transistor, three resistors and one capacitor, and was the size of an adult’s pinkie finger. Today, an IC smaller than a penny can hold 125 million transistors.
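That growth, from a single transistor to 125 million on one chip, is a textbook illustration of Moore’s law. A short sketch backs out the implied doubling time (the roughly 50-year span is my own assumption for the comparison):

```python
import math

transistors_then = 1            # the original IC
transistors_now = 125_000_000   # the figure quoted above
years_elapsed = 50              # assumed span for the comparison

doublings = math.log2(transistors_now / transistors_then)
print(f"{doublings:.1f} doublings, one every {years_elapsed / doublings:.1f} years")
```

A doubling roughly every two years is exactly the cadence Moore’s law predicts.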
For both men the invention of the integrated circuit stands historically as one of the most important innovations of mankind. Almost all modern products use chip technology. The invention of the chip ushered in the digital age and the age of STEM.
Over the past ten years, jobs in the STEM professions have grown three times faster than non-STEM jobs and are projected to grow seventeen percent (17%) through 2018, as compared to nine point eight percent (9.8%) for all other occupations. This indicates there is room for everyone: not just men, not just white men, but women, African-Americans, Asians, Hispanics, and others. It will take all interested parties to fill the upcoming need for trained professionals. With this being the case, colleges and universities across the United States have been working to attract more women into the STEM professions.
The Girl Scouts of America published a study entitled “Generation STEM” involving a questionnaire asking what girls say about the STEM professions. They found that teenage girls love STEM, with seventy-four percent (74%) of high school girls across the country being very interested in STEM-related professions. This definitely runs counter to several negative stereotypes that persist about young ladies and their interest in scientific or mathematic pursuits. Let’s now look at several facts. The digital photograph below shows several surprising conclusions.
Now, I would be remiss if I did not indicate several difficult aspects of women joining the scientific and engineering community. There are challenges, as follows:
Challenge 1: Shortage of mentors for women in STEM fields.
Women tend to have a harder time finding female mentors in STEM occupations. A more experienced employee can show you the ropes and promote your accomplishments. This is important for anyone in any career. It is especially important for women in STEM, because they are often less likely than their male coworkers to promote themselves. As you can see above, many women fully qualified in their fields of study leave their professions due to pressures from other than ability.
Solution 1: If you can’t find a mentor in your organization, join a professional association.
Many associations, such as the Association for Women in Science, the Society of Women Engineers, and the Association for Women in Mathematics, offer networking and mentoring opportunities (both online and in person).
Challenge 2: Lack of acceptance from coworkers and supervisors.
If you work in a STEM field, you might work mainly or exclusively with men. You may find it difficult to be accepted as part of the group. There’s legal help if you face sexual harassment or discrimination in hiring and pay. It’s not always easy to know what to do about subtle or unintentional exclusion. This really surprised me when I read it. In the engineering teams I have been associated with, all lady members were treated with respect and as absolute equals. Apparently, this is not always the case.
Solution 2: Work for a company with female-friendly policies and programs.
Many companies understand that it’s profitable to keep their talented female employees happy. They make special efforts to recruit women. They move them into leadership positions and offer flexible work or mentoring programs. Take time to research potential employers. Find out if they understand and want to reduce the challenges for women working in male-dominated occupations.
Challenge 3: Coping with gender differences in the workplace.
Let’s face it: men and women have different interaction styles. This plays itself out at work. If you’re a woman working mostly with men, your daily reality will be different than if you were in a female-dominated workplace.
Solution 3: Educate yourself.
Read up on gender differences in communication. Learn what to expect by talking to women in STEM fields who can share insights. Don’t wait to be asked before offering an opinion. Learn how to handle mistakes, blame, and guilt in a male-dominated workplace. Learn the art of saying no to unreasonable requests.
One problem that affects both men and women is preparedness during their high school years. Our country is simply not preparing students for the rigors of the STEM professions. They are not ready to move into fields of study that will ultimately see them graduate with a four-year degree and move into technology. The chart below indicates some of the disturbing problems we have as a nation.
- Computer scientists are in high demand, but only a fraction of U.S. high schools offer advanced training on the subject—and that fraction is shrinking.
- Of the more than 42,000 public and private high schools in the United States, only 2,100 high schools offered the Advanced Placement test in computer science last year, down 25 percent over the past five years, according to a recent report by Microsoft.
- In schools where computer science is offered, it often does not count toward graduation. Only nine states—Georgia, Missouri, New York, North Carolina, Oklahoma, Oregon, Rhode Island, Texas, and Virginia—allow computer science courses to satisfy core math or science requirements, according to the report. (This is ridiculous!)
- With an estimated 120,000 new jobs requiring a bachelor’s degree in computer science expected in the next year alone, and nearly 3.7 million jobs in STEM fields currently sitting unfilled, computer science is the future. The shortfall is, for the most part, due to students being unprepared right out of high school. Before students can gain access to these courses, schools need teachers qualified to teach them, and districts with dwindling budgets and restrictive pay structures are competing with the likes of Microsoft, Google, and Facebook for talent. One of the fundamental things we need to do is rethink the way we recruit, retain, and compensate teachers so we can deal with this changing labor market.
- Over the past ten years, the percentage of ACT-tested students who said they were interested in majoring in engineering has dropped steadily from 7.6 percent to 4.9 percent.
- Over the past five years, the percentage of ACT-tested students who said they were interested in majoring in computer and information science has dropped steadily from 4.5 percent to 2.9 percent.
- Fewer than half (41 percent) of ACT-tested 2005 high school graduates achieved or exceeded the ACT College Readiness Benchmark in Math.
- Only a quarter (26 percent) of ACT-tested 2005 high school graduates achieved or exceeded the ACT College Readiness Benchmark in Science.
- In the graduating class of 2005, just slightly more than half (56%) of ACT-tested students reported taking the recommended core curriculum for college-bound students: four years of English and three years each of math (algebra and higher), science, and social studies.
What can be done?
- Align rigorous, relevant academic standards—across the entire K–16 system—that prepare all students for further education and work.
- Establish a common understanding among secondary and postsecondary educators and business leaders of what students need to know to be ready for college and workplace success in scientific, technological, engineering, and mathematical fields.
- Evaluate and improve the alignment of K–12 curriculum frameworks in English/language arts, mathematics, and science to ensure that the important college and work readiness skills in STEM fields are being introduced, reaffirmed, and mastered at the appropriate times.
- Raise expectations that all students need strong skills in mathematics, science, and technology and that all students can meet rigorous college and workplace readiness standards.
- Require all high school students to take at least three years of rigorous, specific college-preparatory course sequences in math and science.
- Recruit, train, mentor, motivate, reward, and retain highly qualified mathematics, science, and technology professionals to teach in middle school and beyond.
- Ensure that every student has the opportunity to learn college readiness skills and has access to key courses in the STEM fields.
- Evaluate and improve the quality and intensity of all STEM core and advanced courses in high schools to ensure both greater focus on in-depth content and greater secondary-to-postsecondary curriculum alignment.
- Sponsor model demonstration programs that develop and evaluate a variety of rigorous science, mathematics, and technology courses and end-of-course assessments for all students.
- Provide opportunities for dual enrollment, distance learning, and other enrichment activities that will expand opportunities for students to pursue advanced coursework in STEM areas.
- Establish and support model programs that identify students with STEM academic potential and interests and expose them to STEM opportunities.
- Include parents, teachers, and counselors in outreach programs that help them learn about STEM professions so they can encourage students to go into those fields.
- Initiate new and expand existing scholarship programs to attract more students into STEM fields.
- Assess foundational science and math skills in elementary school to identify students who are falling behind while there is still time to intervene and strengthen their skills.
- Identify and improve middle and high school student readiness for college and work using longitudinal student progress assessments that include science and mathematics components.
- Establish and support model programs that utilize end-of-course assessments for STEM courses to ensure rigor and effectiveness.
- Incorporate college and workforce readiness measures into federal and statewide school improvement systems.
If a rising tide lifts all boats, improvements in high school science and mathematics will attract more ladies into the STEM professions. Everyone benefits.
As always, I welcome your comments.