I remain absolutely amazed at the engineering effort behind the space probe NASA calls “New Horizons.”  The technology, hardware, software and communication package that made the flyby possible are truly phenomenal.  One thing that strikes me is the predictability of planetary motion, which allows the proper trajectory to be plotted years in advance.  Even though we live in an expanding universe, the physics and mathematics describing planetary motion are solid.  Let us take a very quick look at several specifics.
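
To see just how solid, here is a minimal Python sketch of Kepler’s third law, which ties a planet’s orbital period to its average distance from the Sun. The 39.5 AU figure used for Pluto’s semi-major axis is a rounded value for illustration only.

    # Kepler's third law in solar units: T^2 = a^3, with T in years and a in AU.
    a_au = 39.5                 # Pluto's approximate semi-major axis (rounded, illustrative)
    period_years = a_au ** 1.5  # predicted orbital period in years

    print(f"Predicted orbital period: {period_years:.0f} years")  # roughly 248 years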

THE MISSION:

PROJECT SPECIFICS:

  • LAUNCH:  January 19, 2006
  • Launch Vehicle:  Atlas V 551 first stage; Centaur second stage; STAR 48B solid-rocket third stage
  • Launch Location:  Cape Canaveral Air Force Station, Florida
  • Trajectory:  To Pluto via Jupiter Gravity Assist
  • The teams had to hone the New Horizons spacecraft’s 3-billion-plus-mile flight trajectory to fit inside a rectangular flyby delivery zone measuring only 300 kilometers by 150 kilometers. This level of accuracy and control truly blows my mind.
  • New Horizons used both radio and optical navigation for the journey to Pluto.  Pluto is only about half the size of our Moon and circles our Sun roughly every 248 years. (I mentioned predictability earlier.  Now you see what I mean.)
  • The New Horizons craft is traveling 36,373 miles per hour and has traversed 4.67 billion miles in nine (9) years.
  • New Horizons came as close as 7,800 miles to the surface of Pluto.
  • Using LORRI (the Long Range Reconnaissance Imager), the most crucial instrument for optical navigation on the spacecraft, the New Horizons team took short 100 to 150 millisecond exposures to minimize image smear. Such images gave the teams an estimate of the direction from the spacecraft to Pluto. (A rough sketch of the exposure arithmetic follows this list.)
  • The photographs from the flyby are sensational and far more detailed than expected.
  • The spacecraft flew by the Pluto–Charon system on July 14, 2015, and has now completed the science observations of its closest-approach phase. New Horizons signaled the event with a “phone home” transmission whose telemetry reported that the spacecraft was healthy, its flight path was within the margins, and science data of the Pluto–Charon system had been recorded.
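
For readers who like numbers, here is a rough back-of-the-envelope Python sketch of why exposures that short keep the navigation images sharp. The pixel scale of roughly 5 microradians and the residual pointing drift used below are assumed, illustrative figures, not official mission values.

    # Image smear is roughly (pointing drift rate) x (exposure time), expressed in pixels.
    pixel_scale_urad = 5.0    # assumed angular size of one LORRI pixel, microradians
    drift_urad_per_s = 30.0   # assumed residual spacecraft pointing drift, microradians/second

    for exposure_ms in (100, 150, 500):
        smear_urad = drift_urad_per_s * exposure_ms / 1000.0
        smear_pixels = smear_urad / pixel_scale_urad
        print(f"{exposure_ms:4d} ms exposure -> ~{smear_pixels:.1f} pixel(s) of smear")

At 100 to 150 milliseconds the smear stays under about one pixel for these assumed figures, which is the point of keeping the exposures short.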

HARDWARE:

The hardware for the mission is shown in the graphic below.  From this pictorial we see the following sub-systems:

  • PEPSSI
  • SWAP
  • LORRI
  • SDC
  • RALPH
  • ALICE
  • REX (HGA)

The explanation for each sub-system is given with the graphic.   As you can see, it is an extremely complex piece of equipment representing many hours of engineering design and overall effort.

 

HARDWARE

GOALS FOR THE MISSION:

The goal of the mission is to understand the formation of the Pluto system, the Kuiper belt, and the transformation of the early Solar System.  This understanding will greatly aid our efforts to understand how our own planet evolved over the eons.  New Horizons will study the atmospheres, surfaces, interiors and environments of Pluto and its moons.  It will also study other objects in the Kuiper belt.  By way of comparison, New Horizons will gather 5,000 times as much data at Pluto as Mariner did at Mars.  Combine the data from New Horizons with the data from the Mariner mission and you have complementary pieces of a fascinating puzzle.

Some of the questions the mission will attempt to answer are: What is Pluto’s atmosphere made of and how does it behave?  What does its surface look like? Are there large geological structures? How do solar wind particles interact with Pluto’s atmosphere?

Specifically, the mission’s science objectives are to:

  • map the surface composition of Pluto and Charon
  • characterize the geology and morphology of Pluto and Charon
  • characterize the neutral atmosphere of Pluto and its escape rate
  • search for an atmosphere around Charon
  • map surface temperatures on Pluto and Charon
  • search for rings and additional satellites around Pluto
  • conduct similar investigations of one or more Kuiper belt objects

NOTE:  Charon is also called (134340) Pluto I and is the largest of the five known moons of Pluto.  It was discovered in 1978 at the United States Naval Observatory in Washington, D.C., using photographic plates taken at the United States Naval Observatory Flagstaff Station (NOFS). It is a very large moon in comparison to its parent body, Pluto. Its gravitational influence is such that the center of the Pluto–Charon system lies outside Pluto.

HISTORY:

When it was first discovered, Pluto was the coolest planet in the solar system. Before it was even named, TIME reported that “the New Planet,” 50 times farther from the sun than Earth, “gets so little heat from the sun that most substances of Earth would be frozen solid or into thick jellies.”

The astronomer Clyde W. Tombaugh, then a 24-year-old research assistant at the Lowell Observatory in Flagstaff, Ariz., was the first to find photographic evidence of a ninth planet on this day, February 18, 85 years ago.  His discovery launched a worldwide scramble to name the frozen, farthest-away planet. Since the astronomer Percival Lowell had predicted its presence fifteen (15) years earlier, per TIME, and even calculated its approximate position based on the irregularity of Neptune’s orbit, the team at Lowell Observatory considered his widow’s suggestion of “Percival,” but found it not quite planetary enough. The director of the Harvard Observatory suggested “Cronos,” the sickle-wielding son of Uranus in Greek myth.  But the team opted instead for “Pluto,” the Roman god of the Underworld — the suggestion of an 11-year-old British schoolgirl who told the BBC she was enthralled with Greek and Roman mythology. Her grandfather had read to her from the newspaper about the planet’s discovery, and when she proposed the name, he was so taken with it that he brought it to the attention of a friend who happened to be an astronomy professor at Oxford University. The Lowell team went for Pluto partly because it began with Percival Lowell’s initials.

Pluto the Disney dog, it should be noted, had nothing to do with the girl’s choice. Although the cartoon character also made its first appearance in 1930, it did so shortly after the planet was named, as the BBC noted. While Pluto was downgraded to “dwarf planet” status in 2006, it remains a popular subject for astronomers. They began discovering similar small, icy bodies during the 1990s in the same region of the solar system, which has become known as the Kuiper Belt. Just because Pluto’s not alone doesn’t make it any less fascinating, according to Alan Stern, director of a NASA mission, New Horizons, that will explore and photograph Pluto in an unprecedented spacecraft flyby on July 14 of this year.

“This epic journey is very much the Everest of planetary exploration,” Stern wrote in TIME last month. “Pluto was the first of many small planets discovered out there, and it is still both the brightest and the largest one known.”

NASA released its first images of Pluto from the New Horizons mission earlier this month, although the probe was still 126 million miles away from its subject; the release was timed to coincide with Tombaugh’s birthday. Stern wrote, when the pictures were released, “These images of Pluto, clearly brighter and closer than those New Horizons took last July from twice as far away, represent our first steps at turning the pinpoint of light Clyde saw in the telescopes at Lowell Observatory eighty-five (85) years ago, into a planet before the eyes of the world this summer.”

CONCLUSION:

AMAZING ENGINEERING ACCOMPLISHMENT!


The United States has longed for energy independence for years now.  The effort to lessen or eliminate reliance on foreign sources of petroleum products by developing alternative fuels is now coming to fruition.  The question is: Will compressed natural gas be a future source of energy for the internal combustion engine?  Resources Magazine thinks so.  Let’s take a quick look at the assessment from Alan J. Krupnick, Senior Fellow and Co-Director of RFF’s Center for Energy and Climate Economics.

“Natural gas holds the promise of reducing carbon emissions and dependence on oil. But until recently, it was an also-ran in the sweepstakes for transforming fuel costs and transportation in the United States. The new abundance of domestically available shale gas and continuingly high gasoline and diesel prices could change that. Will these developments be enough to extend the reach of natural gas vehicles beyond buses, garbage trucks, and delivery trucks?”

I feel his conclusions indicate CNG is a very viable alternative for local delivery vans and trucks as well as “the big rigs”.  Other information substantiates his conclusion.  From this, we can see the following.

Industry Analysis

The CNG market has grown at a rate of 3.7% since 2000. The market for these products has experienced slow growth due to: 1.) limited availability of the products, 2.) heat build-up during the compression process, 3.) time delays in the refilling process and 4.) the expense of locating CNG at the market locations. The areas of greatest growth in the CNG market are fleet operators: tractor-trailers, straight trucks, and public transportation such as school and city buses. California and Texas lead the way nationally in CNG fueling stations. There are approximately 1,300 CNG fueling stations in the US today; however, only about 730 are public stations, with the remainder private fleet stations. There are currently fewer than 10 public CNG filling stations within the Tri-State area of Tennessee, Georgia, and Alabama. Southeast Tennessee currently has no CNG fueling stations. The industry is rapidly changing now that the 2014 EPA/NHTSA Heavy-Duty Truck Program has been put in place by President Obama. This legislation has forced fleet and fuel managers to reduce emissions as well as increase fuel efficiency. Small savings have been made by reducing drag, maintaining proper tire pressure, and reducing idling. CNG is a “game changing” modification that addresses the new standards currently in place as well as the further reductions scheduled for 2018. We will adopt a customer-centric approach that addresses the needs of the immediate market based on available original-equipment and aftermarket manufacturers. Some industry pundits have conservatively estimated CNG will see 25% annual growth for the next 5 to 10 years.

Market Segment

Key points in defining the market segment for CNG are existing markets and projected future markets. Electric power and industrial markets make up almost 60% of the current consumer market. Existing markets include the fields of agriculture, industry, and motor fuel in a static environment. Projected markets include opportunities in a more mobile environment. Transportation appears to be the segment most likely to grow, as it makes up less than 1% of total natural gas used. Currently, the market is distributed with limited, if any, diversity of participants. Trends in share gains and losses represent large potential for gains across the entire industry. Share losses are predominantly absorbed by the diesel fuel and propane distributors, as recent supply shortages have clearly shown in the motor fuel and poultry industries. Market share will be lost by those industries due to loss of confidence among their respective customer bases. The current and projected trends in the motor fuel industry are now driven by the Tier II fuel initiative, which will cause off-road diesel fuel to be banned in the near future. The result of the ban will be continued increases in motor fuel pricing. As motor fuel costs increase, CNG becomes not only the clean alternative fuel replacement but also the affordable alternative. CNG cuts the cost of a diesel-equivalent gallon by as much as 50%, based on the volatile and often fluctuating diesel market. CNG is also a much more effective fuel in cold-weather areas than diesel, which is prone to gelling and other cold-weather problems.
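
To put the “as much as 50%” claim in concrete terms, here is a tiny illustrative Python calculation. The diesel and CNG prices below are assumptions chosen only to show the arithmetic of a diesel-gallon-equivalent (DGE) comparison; they are not quoted market prices.

    # Illustrative cost comparison per diesel gallon equivalent (DGE).
    diesel_price_per_gallon = 3.80   # assumed retail diesel price, $ per gallon
    cng_price_per_dge = 2.00         # assumed CNG price, $ per DGE

    savings = 1.0 - cng_price_per_dge / diesel_price_per_gallon
    print(f"CNG saves roughly {savings:.0%} per diesel gallon equivalent")  # ~47% at these prices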

The implied trends in the propane and agricultural industries currently indicate an extended, long-term propane supply shortage. The result is that CNG becomes the efficient, clean energy solution, cutting propane costs by 25 to 50%. Users of CNG are looking for quality and productivity improvements. The history of CNG development has created a need for creative technology solutions that enable full adoption across the natural gas industry. Recent patenting and innovation that Cielo has identified allows CAF to operate more efficiently than diesel or propane. The stability of this market segment is solid, based on CNG product category performance over the past two years. Forecasters predict exponential growth over the next two years.

CNG STATION:

With this in mind, Cielo Technologies, LLC has entered into a partnership to “sink” one CNG station in the Chattanooga area.  Land has been purchased, layouts have been determined, zoning is completed, and site preparation is underway.  Right now, the area selected does not look like much; the following JPEGs illustrate that fact.  I intend to give you progress reports as we erect the facility and, hopefully in five months, show you the completed and operating compound.  Let’s take a very quick look at the site itself.

ENTRANCE DRIVE

The first photo shows the proposed entry to the station itself.  As I mentioned, it is not much to look at and definitely needs considerable attention; that attention is on the way.

EXIT DRIVE

This is the proposed exit from the facility.  We feel less confusion will be the order of the day if we have one way in and one way out.

GROUND SITE

There will be three (3) pumping stations installed on a concrete island running left to right in the JPEG above.  There is room enough for three “18-wheelers.”

LOCATION OF PUMPING STATIONS (2)

Another look at the pumping station locations.  The CNG compressors and storage will be to the right of the pumping stations.  All piping will be underground, unexposed to the elements.  We opted to go with hard-wired connections instead of Wi-Fi due to the possibility of service interruptions.

DARK NET

June 25, 2015


Most of the individuals who read my postings are very well-informed and know that Tim Berners-Lee “invented” the World Wide Web.  In my opinion, the Web is a resounding technological improvement in communication.  It has been a game-changer in the truest sense of the word.  I think there are legitimate uses which save tremendous time.  There are also illegitimate uses, as we shall see.

A JPEG of Mr. Berners-Lee is shown below:

Tim B-L

BIOGRAPHY:

In 1989, while working at CERN, the European Particle Physics Laboratory in Geneva, Switzerland, Tim Berners-Lee proposed a global hypertext project, to be known as the World Wide Web. Based on the earlier “Enquire” work, his efforts were designed to allow people to work together by combining their knowledge in a web of hypertext documents.  Sir Tim wrote the first World Wide Web server, “httpd”, and the first client, “WorldWideWeb”, a what-you-see-is-what-you-get hypertext browser/editor which ran in the NeXTStep environment. This work began in October 1990.  The program “WorldWideWeb” was first made available within CERN in December, and on the Internet at large in the summer of 1991.

Through 1991 and 1993, Tim continued working on the design of the Web, coordinating feedback from users across the Internet. His initial specifications of URIs, HTTP and HTML were refined and discussed in larger circles as the Web technology spread.

Tim Berners-Lee graduated from the Queen’s College at Oxford University, England, in 1976. While there he built his first computer with a soldering iron, TTL gates, an M6800 processor and an old television.

He spent two years with Plessey Telecommunications Ltd (Poole, Dorset, UK) a major UK Telecom equipment manufacturer, working on distributed transaction systems, message relays, and bar code technology.

In 1978 Tim left Plessey to join D.G Nash Ltd (Ferndown, Dorset, UK), where he wrote, among other things, typesetting software for intelligent printers and a multitasking operating system.

His year and one-half spent as an independent consultant included a six-month stint (Jun-Dec 1980) as consultant software engineer at CERN. While there, he wrote for his own private use his first program for storing information including using random associations. Named “Enquire” and never published, this program formed the conceptual basis for the future development of the World Wide Web.

From 1981 until 1984, Tim worked at John Poole’s Image Computer Systems Ltd, with technical design responsibility. Work here included real time control firmware, graphics and communications software, and a generic macro language. In 1984, he took up a fellowship at CERN, to work on distributed real-time systems for scientific data acquisition and system control. Among other things, he worked on FASTBUS system software and designed a heterogeneous remote procedure call system.

In 1994, Tim founded the World Wide Web Consortium at the Laboratory for Computer Science (LCS). This lab later merged with the Artificial Intelligence Lab in 2003 to become the Computer Science and Artificial Intelligence Laboratory (CSAIL) at the Massachusetts Institute of Technology (MIT). Since that time he has served as the Director of the World Wide Web Consortium, a Web standards organization which develops interoperable technologies (specifications, guidelines, software, and tools) to lead the Web to its full potential. The Consortium has host sites located at MIT, at ERCIM in Europe, and at Keio University in Japan as well as offices around the world.

In 1999, he became the first holder of the 3Com Founders Chair at MIT. In 2008 he was named 3Com Founders Professor of Engineering in the School of Engineering, with a joint appointment in the Department of Electrical Engineering and Computer Science at CSAIL, where he also heads the Decentralized Information Group (DIG). In December 2004 he was also named a Professor in the Computer Science Department at the University of Southampton, UK. From 2006 to 2011 he was co-Director of the Web Science Trust, launched as the Web Science Research Initiative, to help create the first multidisciplinary research body to examine the Web.

In 2008 he founded and became Director of the World Wide Web Foundation.  The Web Foundation is a non-profit organization devoted to achieving a world in which all people can use the Web to communicate, collaborate and innovate freely.  The Web Foundation works to fund and coordinate efforts to defend the Open Web and further its potential to benefit humanity.

In June 2009 then Prime Minister Gordon Brown announced that Sir Tim would work with the UK Government to help make data more open and accessible on the Web, building on the work of the Power of Information Task Force. Sir Tim was a member of the Public Sector Transparency Board tasked to drive forward the UK Government’s transparency agenda, and he continues to promote open government data globally as a member of the UK’s Transparency Board.

In 2011 he was named to the Board of Trustees of the Ford Foundation, a globally oriented private foundation with the mission of advancing human welfare. He is President of the UK’s Open Data Institute which was formed in 2012 to catalyze open data for economic, environmental, and social value.

He is the author, with Mark Fischetti, of the book “Weaving the Web” on the past, present and future of the Web.

On March 18, 2013, Sir Tim, along with Vinton Cerf, Robert Kahn, Louis Pouzin and Marc Andreessen, was awarded the Queen Elizabeth Prize for Engineering for “ground-breaking innovation in engineering that has been of global benefit to humanity.”

It should be very obvious from this rather short biography that Sir Tim is definitely a “heavy hitter”.

DARK WEB:

I honestly don’t think Sir Tim realized the full gravity of his work and certainly never dreamed there might develop a “dark web”.

The Dark Web is World Wide Web content that exists on dark nets, overlay networks which ride on top of the public Internet.  These networks require specific software, configurations or authorization to access. They are NOT the open forums we know the web to be at this time.  The dark web forms part of the Deep Web, which is not indexed by search engines such as Google, Bing, Yahoo, Ask.com, AOL, Blekko.com, WolframAlpha, DuckDuckGo, the Wayback Machine, or ChaCha.com.  The dark nets which constitute the Dark Web include small, friend-to-friend peer-to-peer networks, as well as large, popular networks like Freenet, I2P, and Tor, operated by public organizations and individuals. Users of the Dark Web refer to the regular web as the Clearnet due to its unencrypted nature.

A December 2014 study by Gareth Owen from the University of Portsmouth found the most commonly requested type of content on Tor was child pornography, followed by black markets, while the individual sites with the highest traffic were dedicated to botnet operations.  Botnet is defined as follows:

“a network of computers created by malware and controlled remotely, without the knowledge of the users of those computers: The botnet was used primarily to send spam emails.”

Hackers also build botnets to carry out DDoS (distributed denial-of-service) attacks.

Many whistle-blowing sites maintain a presence, as do political discussion forums.  Cloned websites and other scam sites are numerous.   Many hackers sell their services individually or as part of groups. There are reports of crowd-funded assassinations and hit men for hire.   Sites associated with Bitcoin, fraud-related services and mail-order services are some of the most prolific.

Commercial dark net markets, which mediate transactions for illegal drugs and other goods, attracted significant media coverage starting with the popularity of Silk Road and its subsequent seizure by legal authorities. Other markets sell software exploits and weapons.  A very brief look at the table below will indicate activity commonly found on the dark net.

ACTIVITY

As you can see, the uses for the dark net are quite lovely, lovely indeed.  As with any great development such as the Internet, nefarious uses can and do present themselves.  I would stay away from the dark net.  Just don’t go there.  Hope you enjoy this one and please send me your comments.

 

WHERE THE JOBS ARE

May 1, 2015


The following data was taken from a survey done by nerdwallet.com:  Best Places for Engineers, 23 February 2015.

If you follow my postings you know I primarily concentrate on the STEM (science, technology, engineering and mathematics) professions. I track the job market relative to job availability and salary rates across the country and the world.  An online publication called NerdWallet recently published a very informative article on job availability for engineers.  Here is the methodology used to produce the results.

Methodology:

The overall score for each of the metro areas was calculated using the following measures (a sketch of one possible scoring formula follows the list):

  1. Engineers per 1,000 total jobs (50% of each overall score). Data is from the Bureau of Labor Statistics May 2013 Metropolitan and Nonmetropolitan Area Occupational Employment and Wage Estimates.
  2. Annual mean wage for engineering jobs (25% of each overall score). Data is from the Bureau of Labor Statistics May 2013 Metropolitan and Nonmetropolitan Area Occupational Employment and Wage Estimates.
  3. Median gross rent for each place (25% of each overall score). Data is from the 2013 U.S. Census Bureau American Community Survey.
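
NerdWallet does not spell out how each raw measure was normalized, so the Python sketch below shows just one possible way a 50/25/25 weighting could be computed. The normalization choices (scaling by the best value in the sample and inverting rent so cheaper is better) and the sample metro values are assumptions for illustration only.

    # One possible scoring scheme consistent with the 50/25/25 weights above.
    def overall_score(engineers_per_1000, mean_wage, median_rent,
                      best_density, best_wage, lowest_rent):
        density_score = engineers_per_1000 / best_density   # 50% weight
        wage_score = mean_wage / best_wage                   # 25% weight
        rent_score = lowest_rent / median_rent               # 25% weight; lower rent scores higher
        return 100 * (0.50 * density_score + 0.25 * wage_score + 0.25 * rent_score)

    # Made-up example values for a single metro area:
    print(round(overall_score(engineers_per_1000=60, mean_wage=103_000, median_rent=725,
                              best_density=60, best_wage=141_000, lowest_rent=700), 1))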

This study analyzed 350 of the largest metro areas in the U.S.

The following engineering fields, as defined by the Bureau of Labor Statistics, were used to compile the data: aerospace, biomedical, chemical, civil, computer hardware, electrical, electronics, environmental, health and safety engineers, industrial, marine engineers and naval architects, materials, mechanical, mining and geological engineers and all other engineers.  This list just about covers the “waterfront” as far as working engineers go.  Let’s take a look at the results.

List 1-10

List 11-20

In looking at the list above, we can make the following observations:

  • Eleven of the top twenty cities and areas are in the South. The list includes the following southern cities:
  1. Huntsville, Alabama

With a NASA flight center and an Army arsenal, Huntsville is nicknamed “The Rocket City” for good reason. Engineers make up 6% of its employed population and make nearly $103,000 a year, which is higher than the national mean. Median rent is the second lowest in our top 10, at around $725 a month. Huntsville, a northern Alabama city near the Tennessee border, is a hub for aerospace engineers.

  2. Warner Robins, Georgia

Drive 90 minutes south of Atlanta and you’ll hit Warner Robins, where nearly 4% of the working world is in engineering. Here you’ll find Robins Air Force Base, which employs more than 25,000 people, and the Museum of Aviation, the second-largest museum in the nation’s Air Force. However, engineers in this area earn the lowest salary of our top 10, around $86,000 a year, which is lower than the national mean.

  3. Palm Bay-Melbourne-Titusville, Florida

Aside from ocean views, the Palm Bay-Melbourne-Titusville area offers career opportunities for engineers, who make up about 3% of the employed population, earn almost $94,000 a year and pay around $876 in rent. Harris Corp., a worldwide telecommunications company, and Intersil Corp., a semiconductor manufacturer, are headquartered in the area, employing thousands.

  4. Houston-Sugar Land-Baytown, Texas

In the Lone Star State’s most populated area, engineers earn their livelihood in the energy sector at companies including Phillips 66, Marathon Oil and Kinder Morgan. Engineers in this area make a mean salary of almost $123,000, which is the second highest in our top 20. This area also made our top 10 list of Best Places for STEM Graduates.

  5. Midland, Texas

As the saying goes, “Everything’s bigger in Texas,” including the engineering sector. Engineers here take home the largest salary of our top 20 — about $141,000 a year. Midland, with key industries including aerospace, oil and gas, has one of the lowest unemployment rates in the country, 2.6%, according to the U.S. Bureau of Labor Statistics.

  6. Decatur, Alabama

Just 25 miles west of our list’s leading place, Decatur engineers have access to many opportunities in Huntsville. But Decatur itself is home to a United Launch Alliance facility, where spacecraft launch equipment is manufactured. Engineers make up about 2% of Decatur’s workforce, making it the smallest engineering industry in our top 10. However, it still has more engineers per 1,000 employees than the national average.

  • All 20 locations have larger engineering industries than the national average of twelve (12) engineers for every 1,000 employees.
  • Engineers in thirteen (13) of the top twenty (20) places earn more than the national mean engineering salary, which is $92,170.
  •  Fourteen (14) locations have lower median rent than the average U.S. metro area, which is $905 a month.
  • A great deal of employment results from proximity to universities and military-industrial complexes although the “oil patch” certainly draws a great number of individuals in STEM professions.
  • There is a significant absence of areas from the Northeast and the “Rust Belt,” i.e., the northern and midwestern states.

I also think certain factors such as lower taxes, less commuting congestion, a milder climate, and a lower cost of living contribute to the overall reasons companies locate in southern areas.

I hope you enjoyed this one. I will make every effort to keep this list current.  As always, I appreciate your comments.  Keep them coming.


Data for this post was taken from the following sources: 1.) Design News Daily, and 2.) Those references given on the individual slides.

I have been a “blue-collar” working engineer since graduation in 1966.  I think it’s a marvelous profession and tremendously rewarding.  I also find that engineering is one of the most trusted professions.  When you are designing a bridge, a machine, a biomedical device, etc., there is little room for PC.   Being politically correct will get you a bum design.  You design toward accomplishing an objective or satisfying a consumer need.  Also, you can’t talk your way into success.  You have to perform at every phase of the engineering program.  There are processes in place that aid our efforts along the way.  Some of these are as follows:

  • Six Sigma
  • Design for Six Sigma
  • QFD or Quality Function Deployment
  • FMEA or Failure Mode and Effects Analysis
  • Computational Fluid Dynamics
  • Reliability Engineering
  • HALT, or Highly Accelerated Life Testing
  • Engineering Reliability

There are others depending upon the branch of engineering in question.  There are also a large number of computer programs specifically written for each engineering discipline.

With that being the case, what would you say are the highest-paying engineering disciplines?  You might be surprised.  I was.  The following slides basically speak for themselves and present entry-level, mid-level and top-level salaries for graduate engineers.  Let’s take a look at the top ten (10).

BIOMEDICAL

I’m not surprised at biomedical engineering being in the top ten.  There is a huge demand for “bio-engineers” due to rapid advances in technology and significant needs relative to non-invasive medical investigations.

The next one, Civil Engineering, does surprise me a little, although we live with a crumbling infrastructure.  Much more needs to be accomplished to redesign, replace and upgrade our roads, dams, bridges, levees, etc.  We are literally falling apart.

CIVIL

The next two should not surprise anyone.  IT is driving innovation in our time and the need for computer programmers, hardware engineers and software engineers will only increase as time goes by.

COMPUTER ENGINEERING HARDWARE

COMPUTER ENGINEERING SOFTWARE

Chemical engineering has always been one of the top engineering disciplines.  Chemical engineers can apply their “trade” to an extremely large number of endeavors.

CHEMICAL ENGINEERING

EE

During my time, EE jobs were the highest paying.  They still are.

Years ago, environmental engineering was included in the CE discipline. Today, it is important enough to stand alone and provide excellent salary levels.

ENVIRONMENTAL

GEOLOGY AND MINING

Geology and mining engineering has taken off in recent years due to the needs of the oil industry.  More than ever, new sources of natural gas and oil are needed.  The term fracking was virtually unknown ten, and certainly twenty, years ago.

Material Science is one of the most fascinating areas of investigation undertaken in today’s engineering world.  Composite structures, “additive” manufacturing, adhesives, and a host of other areas of materials engineering are producing needs throughout the profession.

Materials Science

MATERIALS SCIENCE

MECHANICAL

I am a mechanical engineer and greatly enjoy the work I do in designing work cells to automate manufacturing and assembly processes.  The field is absolutely wide open.

I hope you enjoy this very brief look at the top ten disciplines.  I also hope you will be encouraged to show this post to your children and grandchildren.  Explain what engineers do and how our profession benefits mankind.

EMBRAER

March 27, 2015


You know Dasher and Dancer and Prancer and Vixen, Gulfstream and Piper and Beechcraft and Cessna; but do you recall the least-known aircraft manufacturer of all?  OK, so I’m not a poet or songwriter.  Have you ever heard of an aircraft manufacturer called EMBRAER?  Do you recognize their logotype?

LOGO

Well, I’ll bet you have flown on one of their aircraft.

HISTORY:

Embraer S.A. is a Brazilian aerospace conglomerate that produces commercial, military, executive and agricultural aircraft.  The company also provides corporate and private aeronautical services. It is headquartered in São José dos Campos in the State of São Paulo.

On August 19, 1969, Embraer (Empresa Brasileira de Aeronáutica S.A.) was created. With the support of the Brazilian government, the company turned science and technology into engineering and industrial capacity. The Brazilian government had been seeking a domestic aircraft manufacturer, making several investment attempts during the 1940s and ’50s to fulfill this need.    Its first president, Ozires Silva, was appointed by the Brazilian government to run the company.   Embraer initially produced one turboprop passenger aircraft, the EMB 110 Bandeirante, a project organized and executed by Ozires Silva. The first EMB 110 Bandeirante to be produced in series made its maiden flight on August 9, 1972. On the 19th of that same month, a public ceremony was held at the Embraer headquarters, attended by officials, employees and journalists from not only Brazil but several countries in South America. That aircraft is shown in the photo below.

40 Years Ago

By the end of the ’70s, the development of new products, such as the EMB 312 Tucano and the EMB 120 Brasilia, followed by the AMX program in cooperation with the Aeritalia (currently Alenia) and Aermacchi companies, allowed Embraer to reach a new technological and industrial level.  At exactly 8:44 AM on April 8, 1982, the twin-engine EMB 121 Xingus PP-ZXA and PP-ZXB took off from São José dos Campos, piloted by Brasílico Freire Netto, Carlos Arlindo Rondom, Paulo César Schuler Remido and Luiz Carlos Miguez Urbano, en route to France. They were the first two aircraft of a total of forty-one (41) ordered by the French government for use in training military pilots for the Air Force (Armée de l’Air) and Naval Aviation (Aéronavale). The aircraft were delivered to the French authorities on April 16, at Le Bourget Airport.  That aircraft may be seen as follows:

Commissioned by the French

The EMB 120 Brasilia became an important milestone in the history of Embraer. Developed as a response to the evolving demands of the regional air transport industry, its design took advantage of the most advanced technologies available at the time. It was the fastest, lightest and most economical airplane in its category.  Most of the EMB 120s were sold in the United States and other destinations in the Western Hemisphere. Some European airlines, such as Régional in France, Atlant-Soyuz Airlines in Russia, DAT in Belgium, and DLT in Germany, also purchased EMB 120s. Serial production ended in 2001. As of 2007, it is still available for one-off orders, as it shares much of its production equipment with the ERJ-145 family, which is still being produced. The Angolan Air Force, for example, received a new EMB 120 in 2007.  If you’ve done much flying at all, you probably have flown on the EMB 120. SkyWest Airlines operates the largest fleet of EMB 120s under the United Express and Delta Connection brands. Great Lakes Airlines operates six EMB 120s in its fleet, and Ameriflight flies eight as freighters.  This configuration has been a real short-haul workhorse. Another, and possibly better, look is as follows:

Air Moldova

COMMERCIAL LONG-HAUL:

Another workhorse is the EMBRAER 195, which may be seen below.  It costs approximately $40 million, about the same as the average narrow-body passenger jet, and seats 108 passengers in a typical layout, 8 more than the average narrow-body passenger plane. The maximum seating capacity is 122 passengers in an all-economy-class configuration.   The 195 uses roughly $11.64 worth of fuel per nautical mile flown (assuming $6 per gallon of jet fuel).  On a per-seat basis, this makes it 7.3% more cost-efficient than the average aircraft.
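
As a quick sanity check on those numbers, the short Python sketch below simply re-derives gallons per nautical mile and fuel cost per seat and nautical mile from the figures quoted above; nothing here is new data.

    # Back-of-the-envelope check on the EMBRAER 195 fuel figures quoted in the text.
    fuel_cost_per_nm = 11.64   # $ of fuel per nautical mile flown
    jet_fuel_price = 6.00      # assumed $ per gallon of jet fuel, per the text
    seats = 108                # typical single-class layout, per the text

    gallons_per_nm = fuel_cost_per_nm / jet_fuel_price
    cost_per_seat_nm = fuel_cost_per_nm / seats
    print(f"~{gallons_per_nm:.2f} gal per nm, about ${cost_per_seat_nm:.3f} of fuel per seat-nm")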

A maximum range of 2,200 nautical miles (equal to 2,530 miles) makes this aircraft most appropriate for long domestic flights, or very short international flights.   With a service ceiling (max cruise altitude) of 41,000 feet, it is just slightly higher than the norm for this type of aircraft and can certainly get above most weather patterns along the flight route.

EMBRAER 195

BUSINESS JET:

The Embraer EMB-505 Phenom 300 is a light jet aircraft developed by Embraer which can carry eight (8) or nine (9) occupants.  It has a flying range of 1,971 nmi (3,650 km) and carried a price estimate of between US$5 million and US$8 million in 2012.

At 45,000 feet (14,000 m), the Phenom 300 is pressurized to a cabin altitude of 6,600 feet (2,000 m). The jet features single-point refueling and an externally serviced private rear lavatory, refreshment center and baggage area. It received FAA Type Certification on 14 December 2009 as the Embraer EMB-505.

On 29 December 2009 Embraer delivered the first Phenom 300 to Executive Flight Services at the company’s headquarters in São José dos Campos, Brazil.  In just four years, the Phenom 300 climbed to the top position on the list of most-delivered business jets, with 60 units delivered in 2013. The Phenom 300 is the fastest seller in NetJets‘ inventory, counting thirty-six (36) aircraft.  It is a beautiful aircraft, with the ten (10) most recent deliveries totaling $90 million.

BUSINESS

MILITARY ISSUE:

Embraer has started work on modernizing a second batch of Northrop F-5E fighters and F-5F trainers for the Brazilian air force.

Three aircraft from a total of 11 are already being worked on at the company’s facilities in Gavião Peixoto, Brazil, with deliveries expected to start later this year. Embraer says it completed the delivery of a first batch of 46 modified F-5EM/FMs in 2012.  That aircraft is shown below.

Fighter

Both the modernized F-5M and AMX are being upgraded to a common avionics configuration. “What we are doing in Brazil is basically a commonality between the Super Tucano, F-5 and the AMX so that the pilots would not have many problems for transition,” Embraer says. “You also reduce costs and assist in training.”

The AMX and F-5 fleets are also receiving Elbit Systems-built radars, in addition to upgraded electronic warfare equipment, in-flight refueling systems and other improvements.

Meanwhile, the Brazilian navy is also upgrading its small fleet of 12 Douglas A-4 Skyhawk carrier-based light strike aircraft. At least one of the Skyhawks is currently being modernized at Gavião Peixoto, but Embraer could not immediately offer any details.

Alongside the modernization work for the Brazilian military, the factory at Gavião Peixoto is at work building a number of Super Tucanos for export customers in Angola and Indonesia.

Brazil has previously increased spending on defense to prepare for hosting the 2014 FIFA World Cup and the 2016 Olympic Games.

There is also a growing realization in the country that it will have to work diligently in the future to protect its vast natural resources. This could unfortunately require military preparedness.

Another example of Embraer’s military ability may be seen from the following aircraft:

Heavy Duty Cargo Aircraft

The Embraer KC-390 is a medium-size, twin-engine jet-powered military transport aircraft now under development.  It is able to perform aerial refueling and to transport cargo and troops and will be the heaviest aircraft the company has in its inventory.  It will be able to transport up to 21 metric tons (23 short tons) of cargo, including wheeled armored fighting vehicles.

AGRICULTURAL:

The Ipanema is the leader of Brazil’s agricultural aviation market, with about 50 years of continuous production, over 1,300 units sold, and roughly 75% of the nation’s fleet in this segment.  Decades of continuous production and constant research have steadily improved the aircraft, with that effort always focused on the needs of customers and the national agricultural market.  The brand demonstrates the reliability, solidity and tradition of the Ipanema.  One other fact: the Ipanema is the first production aircraft certified to fly powered solely by ethanol.  In addition to the economic advantages and the resulting improvement in engine performance, ethanol is a renewable source of energy, which helps protect the environment.

Agricultural

CONCLUSION:

As you can see, the United States aircraft manufacturers do have competition and excellent competition at that.    This foreign entry keeps us on our toes.

FACIAL RECOGNITION

March 6, 2015


THE TECHNOLOGY:

Humans have always had the innate ability to recognize and distinguish between faces, yet computers have only recently shown the same ability.  That ability comes from the proper software running on PCs with memory adequate to handle the mapping process.

In the mid-1960s, scientists began working to use computers to recognize human faces.  This certainly was not easy at first. Facial recognition software and hardware have come a long way since those fledgling early days, and both depend heavily on mathematical algorithms.

ALGORITHMS:

An algorithm is defined by Merriam-Webster as follows:

“a procedure for solving a mathematical problem (as of finding the greatest common divisor) in a finite number of steps that frequently involves repetition of an operation; broadly :  a step-by-step procedure for solving a problem or accomplishing some end especially by a computer.”

Some facial recognition algorithms identify facial features by extracting landmarks, or features, from an image of the subject’s face. For example, an algorithm may analyze the relative position, size, and/or shape of the eyes, nose, cheekbones, and jaw. These features are then used to search for other images with matching features. Other algorithms normalize a gallery of face images and then compress the face data, only saving the data in the image that is useful for face recognition. A probe image is then compared with the face data. One of the earliest successful systems is based on template matching techniques applied to a set of salient facial features, providing a sort of compressed face representation.
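
For the curious, here is a minimal, eigenfaces-style Python sketch of that “normalize, compress, then compare” idea. The random arrays stand in for real gallery and probe images, and the number of components kept is arbitrary; this illustrates the general approach, not any particular commercial system.

    import numpy as np

    rng = np.random.default_rng(0)
    gallery = rng.random((50, 64 * 64))   # 50 gallery faces, flattened 64x64 images (synthetic)
    probe = rng.random(64 * 64)           # one probe face (synthetic)

    mean_face = gallery.mean(axis=0)      # "normalize": subtract the average face
    centered = gallery - mean_face

    # Keep only the strongest components -- the data "useful for face recognition".
    _, _, components = np.linalg.svd(centered, full_matrices=False)
    basis = components[:20]               # 20 principal components

    gallery_codes = centered @ basis.T            # compressed gallery (50 x 20)
    probe_code = (probe - mean_face) @ basis.T    # compressed probe (20 values)

    # The probe is matched to whichever gallery face is nearest in the compressed space.
    best_match = np.argmin(np.linalg.norm(gallery_codes - probe_code, axis=1))
    print(f"Closest gallery face: index {best_match}")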

Recognition algorithms can be divided into two main approaches: geometric, which looks at distinguishing features, and photometric, a statistical approach that distills an image into values and compares those values with templates to eliminate variances.

Every face has numerous distinguishable landmarks, the different peaks and valleys that make up facial features. These landmarks are defined as nodal points. Each human face has approximately 80 nodal points. Some of those measured by the software are:

  • Distance between the eyes
  • Width of the nose
  • Depth of the eye sockets
  • The shape of the cheekbones
  • The length of the jaw line

These nodal points are measured, creating a numerical code, called a face-print, that represents the face in the database.
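
A toy Python sketch of that idea follows. The landmark coordinates are invented for illustration; the point is simply that a handful of distances, normalized by the spacing of the eyes so the result does not depend on image scale, becomes the numerical code stored and compared in a database.

    import math

    # Invented 2D landmark coordinates for a single face (illustrative only).
    landmarks = {
        "left_eye": (30.0, 40.0), "right_eye": (70.0, 40.0),
        "nose_tip": (50.0, 60.0), "chin": (50.0, 95.0),
        "left_cheek": (25.0, 65.0), "right_cheek": (75.0, 65.0),
    }

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    eye_span = dist(landmarks["left_eye"], landmarks["right_eye"])
    face_print = [
        dist(landmarks["left_eye"], landmarks["nose_tip"]) / eye_span,      # eye-to-nose
        dist(landmarks["nose_tip"], landmarks["chin"]) / eye_span,          # nose-to-chin
        dist(landmarks["left_cheek"], landmarks["right_cheek"]) / eye_span, # cheekbone width
    ]
    print(face_print)   # the numerical "face-print" for this face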

In the past, facial recognition software has relied on a 2D image to compare or identify another 2D image from the database. To be effective and accurate, the image captured needed to be of a face that was looking almost directly at the camera, with little variance of light or facial expression from the image in the database. This created quite a problem.

In most instances the images were not taken in a controlled environment. Even the smallest changes in light or orientation could reduce the effectiveness of the system, so they couldn’t be matched to any face in the database, leading to a high rate of failure. In the next section, we will look at ways to correct the problem.

A newly emerging trend in facial recognition software uses a 3D model, which claims to provide more accuracy. Capturing a real-time 3D image of a person’s facial surface, 3D facial recognition uses distinctive features of the face, where rigid tissue and bone are most apparent, such as the curves of the eye socket, nose and chin, to identify the subject. These areas are all unique and don’t change over time.

Using depth and an axis of measurement that is not affected by lighting, 3D facial recognition can even be used in darkness and has the ability to recognize a subject at different view angles with the potential to recognize up to 90 degrees (a face in profile).

Using the 3D software, the system goes through a series of steps to verify the identity of an individual.

 

The nodal points or recognition points are demonstrated with the following graphic.

POINTS OF RECOGNITION

This is where Machine Vision or MV comes into the picture.  Without MV, facial recognition would not be possible.  An image must first be taken, then that image is digitized and processed.

MACHINE VISION:

Facial recognition is one example of a non-industrial application for machine vision (MV).   This technology is generally considered to be one facet of the biometrics technology suite.  Facial recognition is playing a major role in identifying and apprehending suspected criminals as well as individuals in the process of committing a crime or unwanted activity.  Casinos in Las Vegas are using facial recognition to spot “players” with shady records or even employees complicit with individuals trying to get even with “the house”.   This technology incorporates visible and infrared modalities for face detection, image quality analysis, verification and identification.   Many companies add cloud-based image-matching technology to their product ranges, providing the ability to apply theory and innovation to challenging problems in the real world.  Facial recognition technology is extremely complex and depends upon many data points relative to the human face.

Facial recognition has a very specific methodology associated with it. You can see from the graphic above that points of recognition are “mapped,” highlighting very specific characteristics of the human face.  Tattoos, scars, feature shapes, etc. all play into identifying an individual.  A grid is constructed of “surface features”; those features are then compared with photographs located in databases or archives.  In this fashion, positive identification can be accomplished. The graphic below shows the grid developed and used for the mapping process, along with the cameras that capture the image and send it to the software used for comparisons.

MAPPING AND CAMERAS USED

One of the most successful cases for the use of facial recognition was last year’s bombing of the Boston Marathon.   Cameras mounted at various locations around the site of the bombing captured photographs of Tamerlan and Dzhokhar Tsarnaev before their backpacks were positioned for both blasts.  Even though this is not facial recognition in the truest sense of the word, there is no doubt the cameras were instrumental in identifying both criminals.

TAMERLAN AND DZHOKHAR

Dzhokhar Tsarnaev is now the subject of a court case that will determine whether he receives a life sentence or the death penalty.  There is no doubt, thanks to MV, concerning his guilt or innocence.  He is guilty. Jurors in Boston heard harrowing testimony this week in his trial. Survivors, as well as police and first responders, gave often-disturbing accounts of their suffering and the suffering of runners and spectators as a result of the attack. Facial recognition was paramount in his identification and ultimate capture.

As always, your comments are very welcome.
