Who Were the Titans of Telecommunication and Information Technology?


Multimedia Essay By: David Johanson Vasquez © All Rights – Second Edition – Series: 1 & 2

Inventions are rarely the result of one individual’s work; they are created from collective efforts over time, from many individuals’ observations, theories and experiments. Benjamin Franklin’s role in demystifying electricity, Michael Faraday’s discovery of “induced” current, Nikola Tesla’s and Guglielmo Marconi’s wireless radio communication… these are just a few of the technology pioneers responsible for developing modern telecommunications. I regret not having the resources to include in this program all the men and women whose discoveries made telecommunication and information technology possible.

Definition of technology — “the systematic application of scientific or other organized knowledge to practical tasks” (J.K. Galbraith); “the application of scientific and other organized knowledge to practical tasks by… ordered systems that involve people and machines” (John Naughton). For an alternative graphic format of this essay: www.ScienceTechTablet.wordpress.com

Telecommunications took its first infant steps as the industrial revolution was rapidly compressing concepts of time and space. The first half of the 19th century witnessed modern society’s reliance on new innovations — steam locomotive trains for mass transit and electronic communication through telegraph technology. Steamships shrank the world by delivering capital goods, raw resources and people to remote locations in a fraction of the time it had taken before. With the industrial revolution nearing its peak at the close of the century, a new communication innovation was developed, which helped transform the modern age into a postmodern era.

Inventor Alexander Graham Bell’s Washington, D.C. company, which developed the telephone, eventually evolved into a prime research laboratory. Bell’s vision for an R&D lab created a foundation for the digital technologies of today. In the following century, another key R&D technology titan, Xerox PARC, entered the stage, helping to set in motion personal computing and expand the information technology revolution.

The steamship S.S. Empress of India near Vancouver B.C.
From the private collection of: David A. Johanson ©

Scottish-born Alexander Graham Bell. From the collection of: Library of Congress

The French Technology Connection

In 1880, a visionary French government recognized the importance of Alexander Bell’s invention and awarded him the Volta Prize. A sum of 50,000 francs, or roughly $250,000 in today’s currency, came with the honor. The funds were reinvested into Bell’s laboratory for use in the analysis, recording and transmission of sound. Growing proceeds from the lab were used for additional research and for education to advance knowledge of deafness.

Can You Hear Me Now                                         

The telegraph and telephone were the first forms of electrical, point-to-point telecommunications and qualify as early versions of social-media platforms. Over time, phone service, convenience and quality have steadily improved. In my youth during the early 1960s, I spent summers visiting relatives with farms in Wisconsin who had phones connected on “party lines” (several phone subscribers on one circuit). When picking up a phone connected to a party line, your neighbor might have a conversation in progress. If a conversation was taking place, you could politely interrupt and request to use the phone for urgent business. Today, phone service has become so advanced that it is taken for granted as a form of personal utility.

In 1925, Bell Telephone Laboratories was created from a merger of the engineering department of American Telephone & Telegraph (AT&T) and Western Electric Research Laboratories. Ownership of the lab was shared evenly between the two companies; in return, Bell Laboratories provided design and technical support for Western Electric’s telephone infrastructure used by the Bell System. Bell Labs completed the symbiotic relationship for the phone companies by writing and maintaining a full spectrum of technical manuals known as Bell System Practices (BSP).

An Invisible Bridge From Point A To Point B

Bell Laboratories quickly began developing and demonstrating, for the first time, telecommunication technology we now depend on for economic growth and to hold our social fabric together. Bell accomplished the first long-distance transmission of 128-line television images, from New York to Washington, D.C., in 1927. This remarkable event ushered in television broadcasting, creating a new form of mass multimedia. Now people could gather in the comfort of their homes and witness… live news reports, hours of entertainment and product advertisements, which helped to stimulate consumer spending in a growing economy.

Radio astronomy, which gave rise to powerful space-exploring telescopes, was developed through research conducted by Karl Jansky in 1931. During the following decade, Bell Labs’ Clinton Davisson shared the Nobel Prize in Physics for the discovery of electron diffraction, which was a key factor in the later development of solid-state electronics.

The Forecasting Power of Numerical Data

An important component of renewable energy, the photovoltaic cell, was developed in the lab during the 1940s by Russell Ohl. A majority of the United States’ statistician superstars, such as W. Edwards Deming, Harold F. Dodge, George Edwards, Paul Olmstead and Mary N. Torrey, came from Bell Labs’ quality assurance department. W. Edwards Deming’s genius would later help revitalize Japan’s industry and be used in Ford Motors’ successful quality-control initiatives in the 1980s.

W. Edwards Deming

The U.S. government used Bell Labs for a series of consulting projects relating to highly technical initiatives and for the Apollo program. Several Nobel Prizes have been awarded to researchers at the laboratory, adding to its fame and growing prestige. In the 1940s many of the Bell Labs facilities were moved from New York City to nearby areas of New Jersey.

Replica of the first transistor.

Smaller Is Better In The World Of Electronics

Inventors of the transistor, l. to r.: Dr. William Shockley, Dr. John Bardeen, Dr. Walter Brattain, ca. 1956. Courtesy Bell Laboratories

Perhaps Bell Laboratories’ most marvelous invention was the transistor, invented on December 16, 1947. Transistors are at the heart of just about every electrical device you’ll use today. These crucial artifacts transformed the electronics industry by miniaturizing multiple electronic components used in an ever-expanding array of products and technical applications. Transistors also greatly reduced the amount of heat in electronic devices, while improving overall reliability and efficiency compared to fragile vacuum-tube components. Once more, the lab’s select team of scientists was rewarded with the Nobel Prize in Physics for these essential components of telecommunications.

The mobile phone was also created in 1947, with the lab’s commercial launch of Mobile Telephone Service (MTS) for use in automobiles. Some 20 years later, cellular phone technology was developed at Bell Labs and went on to become the ubiquitous form of communication it is today. In 1954 the lab began to harness the sun’s potential by creating the world’s first modern solar cell. The laser (Light Amplification by Stimulated Emission of Radiation) was first described in a 1958 Bell Labs publication. The laser’s growing spectrum of applications includes communications, medicine and consumer electronics.

A Perpetual Revolution In The Sky Unites The World

In 1962, Bell Labs pioneered satellite communications with the launch of Telstar 1, the first orbiting communication satellite. Telstar enabled virtually instant telephone calls to be bounced from coast to coast and throughout the world. This development unified global communications and provided instant 24-hour news coverage.      

Bell Labs introduced the replacement of rotary dialing with touch-tone in 1963; this improvement vastly expanded telephone services with 911 emergency response, voice mail and call-service capabilities.

Image used in Byte Magazine for an article on VM2 assembly language. Photo-illustration by: David A. Johanson © All Rights

 

A New Distinct Language For Harnessing Machines

It has been greatly underreported that the Unix operating system and the C and C++ programming languages, essential to information technology (IT), were all created at Bell Labs. These crucial computer developments were established between 1969 and 1972, while C++ came later, in the early 1980s. C was a breakthrough as a streamlined and flexible form of computer coding, making it one of the most widely used programming languages today. Unix enabled comprehensive networking of diverse computing systems, providing the internet’s dynamic foundation. Bell Laboratories’ inventions over the next two decades increasingly expanded micro-computing frontiers, which helped to establish personal computing.

In 1980, Bell Labs tested the first single-chip 32-bit microprocessor, enabling personal computers to handle complex multimedia applications.

 

A major corporate restructuring of AT&T, the parent company of Bell Laboratories, was ordered by the U.S. federal government in 1984, splitting up its subsidiaries as part of a divestiture agreement. This event proved to be an example of overregulation, severing important links for funding technology R&D projects. Although AT&T’s monopoly in the telephone industry gave it an economic advantage, that monopoly also allowed for the necessary funding of Bell’s R&D labs. Indirectly, U.S. taxpayers made one of the best investments by subsidizing the foundation of our current telecommunication and information technology infrastructure. In 1996, Bell Laboratories became part of Lucent Technologies, while AT&T retained its own research organization under the name AT&T Labs. Since 1996, AT&T Labs has been awarded over 2,000 patents and has introduced hundreds of new products. In 2007, Lucent’s Bell Labs and Alcatel’s research organization merged into one organization under the name Bell Laboratories. Currently, the Labs’ purpose is directed away from scientific discovery and focused on enhancing existing technology, which yields higher financial returns.

Pause & Reflect: Questions for continuous learning part 1.

1.) What were the first forms of electrical, point-to-point telecommunications?
2.) What revolution was taking place when early forms of telecommunications were invented, and can you name at least two technology innovations from that era?
3.) Define the word technology.
4.) Who founded Bell Research and Development Labs?
5.) Name at least two developments for which Bell Labs researchers were awarded Nobel Prizes.
6.) Pick the one Bell Labs invention you believe was most important for helping develop modern telecommunications or personal computing.

Any sufficiently advanced technology is indistinguishable from magic.

                                                                          — Arthur C. Clarke

 

Advanced Technology Takes Root In The West

In the first half of the 20th century, Bell Labs’ dazzling R&D creations aligned seamlessly to establish a solid foundation in telecommunications. Most of the Labs’ bold research had been conducted in the industrialized eastern portion of the United States. By the 1950s, new evolving industries on the West Coast were benefiting from Bell’s technological developments. Stanford University’s research facilities in Palo Alto, south of San Francisco, attracted corporate transplants — most notably IBM, General Electric and Eastman Kodak. In 1970, the Xerox Corporation of Rochester, New York established a research center known as Xerox PARC (Palo Alto Research Center). PARC’s impact on R&D would soon be felt, acting as a stimulating catalyst for personal computing and information technology development.

 Creative Sanctuary For Nurturing Daring Ideas

Jack Goldman, Chief Scientist at Xerox, enlisted physicist Dr. George Pake, a specialist in nuclear magnetic resonance, to help establish a new Xerox research center. Selecting the Palo Alto location gave the scientists greater freedom than was possible near the Rochester headquarters. The location also provided huge resource opportunities, with talent pools of engineers and scientists from the numerous research centers located in the Bay Area. Once the West Coast lab had a foothold, it became a sanctuary for the company’s creative misfits — passionate science engineers who were determined to create boldly. One of the few downsides of the new facility’s location was fewer opportunities for lobbying and promoting critical breakthrough developments to top management located a continent away. Xerox PARC had an inspiring creative influence, along with universal appeal, which attracted international visitors. A collaborative, open atmosphere helps to define the creative legacy of PARC. The cross-pollination of ideas and published research between the R&D facility and Stanford’s computer science community pushed digital innovation toward new thresholds.

A Premiere Of Personal Computing Tools Is Unveiled

Xerox PARC discovered a target-rich environment of ideas from Douglas Engelbart, who worked at the Stanford Research Institute (SRI) in Menlo Park. Engelbart gave the “Mother of All Demos” of personal computing in December 1968, astonishing the computer science audience with a remarkable debut of the computer mouse, hypertext, email, video conferencing and much more. Bitmap graphics and the graphical user interface (GUI), which provides window features and icons, are just a few of the revolutionary concepts developed by PARC for personal computing. The list of PC innovations and developments continues with the laser printer, the WYSIWYG text editor, InterPress (a prototype of PostScript) and Ethernet as a local-area computer network — inspiring the PARC Universal Packet architecture, which resembles today’s internet. Optical disc technologies and LCDs were developed by PARC materials scientists, adding yet more to its diverse technology portfolio.

 The Shape Of Things To Come

Xerox PARC’s R&D efficiently blended these vital new technologies and leveraged them all into a personal computer workstation called the “Alto.” The futuristic Alto was light-years ahead of its 1973 debut — bundled with a mouse, a graphical user interface and the connectivity of Ethernet. Interest in this revolutionary PC kept expanding as countless demonstrations were given to legions of intrigued individuals. The increasing demand for witnessing the power of PC computing was telegraphing the need for a new consumer market. For the first time, a “desktop-sized computer” could match the capabilities of a full-service print shop. Advanced technology always comes with a hefty price tag, and the Alto was no exception, putting it beyond the reach of most consumers. Despite the high price point, the excitement, fame and glory of the Alto grew — as did the admiration of the bold new world of Apple Computer and its superstar founder, Steve Jobs.

Xerox Alto, 1973. Was this the apple of Steve Jobs’s eye? It certainly was the first personal computer to include most of the graphical interface features we recognize today.

Torch Of The Titans Lights New Horizons

By 1979, Apple was beginning to advance its own flavor of user-friendly interfaces with the development of the Lisa and Macintosh personal computers. Both products featured screens with multiple fonts, using bitmap displays for blending graphics and text. From early on, there were Apple graphics engineers associated with Xerox PARC — either through former employment or in connection with Stanford University. Apple engineers aware of the advances made in graphical interfaces with PARC’s Alto prompted Steve Jobs to arrange a parley with PARC. In late 1979, Steve Jobs and his Apple engineering entourage arrived to view an Alto demonstration at the Xerox facilities. The meeting’s outcome proved Jobs was a master of showmanship and marketing jujitsu; he did not disclose a previously negotiated, sizable investment from Xerox’s venture capital group.

Gravitational forces began shifting in favor of Steve Jobs and Apple Computer to capitalize on the market potential of personal computing. PARC’s computer engineers and scientists clearly understood the economic potential of the information business they had helped to build… but top Xerox executives certainly did not. Xerox had a history of dominating the lucrative copy-machine market — this was the business model Xerox’s corporate decision makers were comfortable with, and one they would not risk venturing far from.

Most of PARC’s personal computing developments experienced the same frustrating fate of being cherry-picked by others — allowing lucrative opportunities to go at bargain rates to new companies like Apple Computer. Apple’s alchemy of perfect timing, creative talent and visionary insight quickly aligned toward harnessing information technology products for an emerging market convergence. The creative inspiration and marketing savvy Steve Jobs applied to personal computing created seismic ripple effects, which we’re still experiencing today.


Nothing Ventured, Nothing Gained  

Recently, a handful of media and tech industry critics have been citing undeserved shortcomings of Bell Labs and Xerox PARC. Too often, corporate R&D labs are faulted for not fully marketing their technology developments or capitalizing on scientific inventions. Rarely mentioned in these oversimplified reviews is an understanding of an R&D lab’s purpose, or of a mission of innovation directed by the parent company’s strategic goals. Failing to understand the reality of this relationship detracts from the technological importance and diminishes the accomplishments of these remarkable engineers and scientists. Lost in the critics’ hindsight is an under-reporting of the titanic obstacles facing the marketing, manufacturing and distribution of innovative products.


Thrilling technical breakthroughs are what grab headlines — rarely are the successful efforts of corporate marketing or brilliant production logistics recognized or mentioned. It’s a disconnect to judge an R&D lab’s success solely on the financial returns of its inventions.

The laser printer in particular dispels the myth that Xerox PARC mismanaged all of its developments. Gary Starkweather, a brilliant optical engineer at Xerox PARC, developed the laser printer. Starkweather fought pitched battles with Xerox management over promoting the laser printer, but he eventually triumphed, and the laser printer went on to earn billions of dollars — enough to repay the investment cost of Xerox PARC several times over. Eventually Starkweather moved on to greater opportunities when Steve Jobs offered him a job in Cupertino.

Brilliant R&D technology requires an equally creative and open-minded group of executives to convert technology innovation into a marketable product. These decision makers must maintain iron wills and the courage to shepherd the technology product through its entire volatile development process.

IBM’s iconic 305 RAMAC, the first commercial computer to use a hard disk drive (HDD), is a classic example of a product-development challenge. Introduced in 1956, the RAMAC stored a whopping five megabytes of data. Apparently, the HDD storage capacity could have been expanded well beyond 5 MB, but this was not attempted because IBM’s marketing department didn’t believe they could sell a computer with more storage.

IBM 305 RAMAC — first commercial computer to use a hard disk drive in 1956.

R&D labs take creative risks in developing new ideas; most of these developments won’t make it to market, but that’s the price of creativity. Using intuition to take risks, and knowing that some failure is necessary to pave the road toward successful discoveries, builds confidence in trusting one’s creative resources. Too often, the creative process is misunderstood and undervalued in our society’s perceived need for instant control and results. In the past, I’ve personally witnessed this attitude reflected in our educational system; however, the viewpoint is progressively shifting to recognize the value of the creative process. Steve Jobs and Apple Computer are a good illustration of a company that traditionally emphasized and embraced the creative spirit. Creative employees are considered the most valued resource at Apple, and they are encouraged to nurture their creative uniqueness. The shortsighted emphasis on quarterly results that has affected most of American business culture is refreshingly absent from Apple’s overall mindset, allowing for more sustained and successful business initiatives.

Where Have All The R&D Labs Gone — Innovation Versus Invention

The era of industrial, ‘closed inventive’ research & development labs — have faded into the background of yesterday’s business culture. Internal silos, once the proprietary norm, have been day-lighted to allow fresh ideas and collaborative efforts to circulate.

For the past 10 years, corporations have steadily reversed their long-term, pure scientific research in favor of efforts toward quicker commercial returns. In 2011, Intel Corporation dropped its ‘boutique’ research ‘lablets’ in Seattle, Berkeley and Pittsburgh — opting for academic research to be conducted at university facilities. Intel continues to maintain its more profit-oriented Intel Labs. This industry strategy has repeatedly cloned itself within the corporate research world, as it is far easier to realize a profit from innovation than from pure invention.

Perhaps the golden age of great research & development labs has run its course — but not before replacing the analogue technology of the industrial era with a digital one. A century ago, using creative, innovative and bold scientific vision, Bell Labs set the standard for future R&D labs. Xerox PARC helped to extend Bell Labs’ marvelous inventions and innovations with a solid platform of creative research, developing mass markets for the postmodern telecommunications and personal computing of today. ~

 

Pause & Reflect: Questions for continuous learning – part 2.

1.) Name the parent company (based in New York) featured in the essay and its research and development lab, which moved into California’s Bay Area.
2.) What was the profitable product (used for duplicating documents) that this company had originally been built on?
3.) Give at least two reasons why this R&D lab was so inventive.
4.) What stopped the lab’s parent company from realizing more profits from its inventions?
5.) What were the names of the young, iconic tech entrepreneur and his company (named after a red fruit), who was able to creatively package and market early Silicon Valley PC innovations?
6.) What’s the difference between invention and innovation?
7.) In your opinion, who were the top 10 inventors of all time, and how did they make your top 10?


References & Links    

Bell Labs – Wikipedia, the free encyclopedia
Bell Labs
Telstar 1: The Little Satellite That Created the Modern World 50 Years Ago | Wired Science | Wired.com
Was Bell Labs Overrated? – Forbes
Top 10 Greatest Inventors in History | Top 10 Lists | TopTenz.net
History of Lucent Technologies Inc. – FundingUniverse
Volatile and Decentralized: The death of Intel Labs and what it means for industrial research
Inventive America | World | Times Crest
Bell Labs Kills Fundamental Physics Research | Gadget Lab | Wired.com
http://www.westernelectric.com/history/WEandBellSystemBook.pdf
HistoryLink.org- the Free Online Encyclopedia of Washington State History
Xerox PARC, Apple, and the Creation of the Mouse : The New Yorker
1956 Hard Disk Drive – Disk Storage Unit for 305 RAMAC Computer
IBM 305 RAMAC: The Grandaddy of Modern Hard Drives
WSJ mangles history to argue government didn’t launch the Internet | Ars Technica
A History of Silicon Valley


Boeing’s 787 Dreamliner Historic First Flight From Paine Field, Everett, WA.

Multimedia and video essay by: David Johanson Vasquez © All Rights

The presentation includes: Video of a 787 Dreamliner first flight, aerospace structural testing practices, aerospace engineering design practices, aerospace manufacturing, fiber composite materials.  

My video camera kit had been prepared months in advance, ready at a moment’s notice for the maiden flight of Boeing’s 787 Dreamliner — Boeing’s 21st-century entry airliner. Finally, Dave Waggoner, the director of Paine Field Airport, cued me in on the date to witness an evolutionary advance in commercial aviation.

Cameras Packed And Ready To Go

My home is only a short drive from Boeing’s production facilities at Paine Field, Everett, so I was motivated to video-record this making of 21st-century aviation history. Due to initial production delays, an entire year went by before I received reliable news that the 787-8 wide-body, long-range airliner was ready for her much-anticipated maiden flight. The 787 Dreamliner’s first flight took place at 10:27 a.m. PST, December 15, 2009.

Experienced As A Boeing Scientific Photographer

The 787 first-flight video project brought back some great memories from my former career as an aerospace photographer with the Boeing Company. When first hired by the iconic aviation leader, my assignment involved providing video support for the Everett plant’s test engineering groups, who were conducting bulkhead fatigue tests on airliner fuselages. In preceding years, some airlines had begun experiencing in-flight catastrophic failures related to metal fatigue. Tragically, the determined cause was the age of the aircraft — specifically, stresses created when interior cabins went through an excessive number of pressurization cycles.


In the 1980s, a Boeing 737 was dramatically documented as it safely landed with a massive section of its fuselage missing. The Aloha Airlines 737 jetliner had experienced a catastrophic failure due to metal fatigue. The metal fatigue caused by pressurization cycles on the aircraft was not clearly understood, so the FAA required engineering tests to research the potential safety threat. A series of highly documented tests was conducted over a period of months, going through thousands of pressurization cycles. The purpose was to recreate what a jet airliner physically experiences when the cabin is repeatedly pressurized and depressurized — as happens every time an airliner takes off, gains altitude and eventually returns for its landing. Our team of scientific photographers had a series of video cameras strategically placed within the test bulkhead, which sat shrouded in layers of protective coatings in a remote section of the Everett facilities. Over-pressurizing the bulkhead eventually caused the anticipated failure, announced by a thunderous sound of cracking metal. The bulkhead test was well documented using various engineering test methods and imaging equipment. The valuable test data gathered was immediately analyzed, studied and put to methodical use in redesigning, engineering and manufacturing safer jet airliners.

Examining a fuselage section of the 787 which uses composite carbon fiber materials.

Boeing’s Traditional Practice Of Over-Engineering

My experience confirms what commercial pilots and engineers claim regarding Boeing’s reputation for the conservative practice of “over-engineering” its aircraft. Historically, the over-engineering approach has proven itself a life-saving benefit — with countless Boeing aircraft surviving horrific damage… yet still landing safely. Documentaries on WWII aircraft featuring shot-up Boeing planes returning safely are an example of over-engineering.

For teams monitoring tests — elaborately configured structures with attached strain gauges and actuators trying to force the failure of an airplane part — an aerospace test may go on for days, or even months; the experience feels like sitting in the bleachers for hours watching slow-motion glacier races in progress. All the invested time and effort that goes into these aerospace component tests helps to assure the flying public’s safety and the airlines’ performance records.

Engineers enjoy seeing how much torturous abuse their designed support systems will take before they bend, crack or break. At the instant a component does finally fail [normally, after far exceeding the range of what it was designed to do], you’ll hear a loud noise caused by the test object going beyond its limit. The sound of a breaking part ends the tension of monitoring a test for hours or days — in an instant, the group of test engineers and technicians starts cheering like a home team had just scored in a stadium full of their fans.

Boeing 787-8 Dreamliner taxiing for its historic, maiden flight on December 15, 2009 from Paine Field Airport, Everett, WA.

Carbon Fiber Future In Aviation

One of many significant technological improvements in the new long-range, wide-body 787 Dreamliner is the high percentage of composite carbon fiber materials used in its construction. The amount of composite material employed in today’s aircraft has substantially increased from when it was initially developed and used in military aircraft. I recall how amazingly light wing spars made of carbon fiber composite materials are, from moving them under lighting setups at Boeing’s Gateway studio. It was fascinating observing and photographing the manufacturing of composite materials, as the process involves using massive heated autoclaves to form predesigned sections for aircraft structures.

Now, remember the bulkhead test from a previous paragraph? Carbon fiber composites eliminate the metal fatigue associated with pressurizing the passenger cabin. Fewer concerns over metal fatigue allow for higher cabin pressurization for passenger comfort — more importantly, the integrated use of composite materials ensures greater safety, with substantially less risk to the structural integrity of the airliner.

Is Boeing’s Reliance On Outsourcing The Main Culprit For The 787 Dreamliner’s Worldwide Grounding?

In the past 15 years, Boeing’s upper management has broken formation from its traditional engineering leadership, replacing it by promoting executives with business and marketing backgrounds. The current Boeing regime embraces an outsourcing strategy; unfortunately, this trend of maximizing profits for shareholders has been ongoing at U.S. companies for the past two decades. The negative consequences of replacing engineering management with a business one are clearly apparent in the power transmission industry — deregulation and marketing-driven management in the electric power industry have significantly placed this essential infrastructure at risk [an overstretched power grid, vulnerable outdated high-power transformers]. Please see my multimedia essay, Will the Last People Remaining In America, Turn the Lights Back On?: http://sciencetechtablet.wordpress.com/tag/solar-storm-testimony-to-u-s-senate/


A heavy dependence on foreign outsourcing is cited as a cause of unforeseen 787 production delays. Consistent quality-control monitoring becomes problematic when components are manufactured offsite; as a result, these issues can sometimes lead to extended, unanticipated problems.

photo illustration

Outside vendors are capable of producing components of equal, if not superior, quality to Boeing's in some technical areas. In fact, legions of aerospace companies in the Puget Sound region supply critical parts for the 787 Dreamliner. Some outsourcing is absolutely necessary for Boeing to compete with Airbus. The concern is outsourcing critical components in a new airplane program that is attempting to use technology never before used in a commercial airliner. It's ironic that li-ion batteries are at the center of the 787's grounding: lithium batteries have been a concern of the FAA, TSA and NTSB for over a decade, even leading to bans and restrictions on passengers bringing them aboard commercial flights. It's almost hubris, or a form of high-risk gambling, to initially rely so heavily on outside vendors (GS Yuasa, the Japanese firm making the li-ion cells, and Thales, the French corporation making the batteries' control systems) to produce an unproven, prototype system.

While working as a Boeing employee in the 1990s, I recall an incident with a vendor supplying thousands of counterfeit aircraft-quality fasteners made in China. Fortunately, the fiasco was caught early, but not before many hours and dollars were lost going back to inspect wings on the production line to remove and replace the defective fasteners. Unless solid metrics are in place to assure critical standards are met for each component, it's only a matter of time before a failure occurs. Boeing has traditionally been an aerospace company that "over-engineers" its airplanes and errs on the side of safety. Hopefully the company has maintained and continues to practice these quality assurances. Outsourcing is practical both economically and politically for companies with international sales. It's a successful strategy Boeing has used for many years; outsourcing has proven to provide incentives for foreign airline companies to buy Boeing aircraft, in order to support their own domestic aerospace industries.

The American auto manufacturer Tesla had similar "thermal runaway" issues when first using li-ion batteries to power its Roadster. Tesla Motors benefited from its learning curve by switching to lithium iron phosphate batteries, which run at cooler temperatures. The innovative automaker also developed its own battery-pack architecture, with proprietary liquid-cooled packs that control battery cell temperatures within self-contained, metal-lined enclosures. The nontoxic Tesla battery packs are manufactured domestically in Northern California. Perhaps Boeing should consider manufacturing all critical systems in-house and domestically, as Tesla has done.

According to MIT Technology Review's Kevin Bullis, who points to the website of Boeing's battery manufacturer, GS Yuasa, the 787 uses lithium cobalt oxide batteries, which the firm also manufactures for the International Space Station. These batteries are categorized as having high energy-storage capacity, but are not considered as heat-resistant as other battery chemistries. Another issue I speculate could contribute to the 787 li-ion batteries overheating relates to Boeing's reintroduction of an electrical compressor system to provide higher pressurization for the cabin environment. This type of cabin pressurization system requires more electrical energy than standard systems, so could it be putting additional demands on the batteries? Part of the advantage of using more composite materials in the 787 was to reduce the metal fatigue caused by cabin pressurization cycles. The Dreamliner uses higher cabin pressure than most aircraft to make it more comfortable for passengers; however, li-ion battery manufacturers specifically warn against over-pressurizing these batteries. Is the cabin pressure contributing to pushing the li-ion cells beyond their tolerance?

Whether or not the stated technical issues are of real concern for the onboard battery packs can only be determined by thorough testing.

Again, it’s to early to know the exact extent of the problem with the 787’s battery systems. The issue will soon be isolated, as Boeing has long history of thoroughly testing and over-engineering its aircraft systems. One thing is certain, it’s rare for Boeing to experience a new aircraft being grounded simultaneously by  Japan’s transport ministry and by the FAA.

Ultimately,  A Bright Future Awaits The 787 Dreamliner

Gaining profitable fuel savings by developing a lighter wide-body aircraft, combined with fuel-efficient GE or Rolls-Royce engines, produces a major advance in airliner capability. The tangible benefits in comfort, interior lighting and convenience contribute to a remarkable passenger experience. Together, the evolutionary technical advances in the Boeing 787 Dreamliner mark a remarkable new development for commercial aviation. ~

Future of Flight Museum - Mount Rainier & Paine Field in background - Everett, WA


Boeing 787 Dreamliner Maiden Flight – December 15, 2009 – Paine Field, Everett, WA.  Video by: David Johanson Vasquez © All Rights Reserved

Will the Current Solar Storms Hitting Earth Lead to Lights-Out for Us by 2013-2014?

Essay and photos by: David Johanson Vasquez © All Rights

Solar storm forecasts & updates are located above the essay's first paragraph. These updates will be posted anytime a major solar disturbance is reported. Please read the essay first and return at any time to view posted updates.

Joint USAF/NOAA Report of Solar and Geophysical Activity
SDF Number 197 Issued at 2200Z on 15 Jul 2012

IA.  Analysis of Solar Active Regions and Activity from  14/2100Z
to 15/2100Z:  Solar activity has been at low levels for the past 24
hours. Region 1520 (S17W48) remains the largest and most
magnetically complex region on the disk, however it has remained
rather stable and quiet. Regions 1521 (S21W60) and 1519 (S17W68) 
have been the most active regions producing low-level C-class
events. Both regions have shown moderate growth in sunspot area and
magnetic complexity. No Earth directed CMEs were observed during
the period.

IB.  Solar Activity Forecast:  Solar activity is expected to be at
low levels with a chance for M-class events for the next three days
(16-18 July).

Friday, July 13th, 2012: A massive X-class solar flare, which occurred yesterday, is hurling a coronal mass ejection (CME) toward Earth; it will arrive at approximately 5:17 A.M. EST, according to NASA. Several events involving this latest solar storm are unusual and cause for concern: it's the second massive X-class flare (X is the most powerful class of solar flares) to take place within a week; the angle of the CME points directly at Earth; the Northern Lights may be sighted in the southern U.S.; and NOAA forecasts a mild to moderate geomagnetic storm on Earth, while NASA predicts a medium to severe storm.

Earlier today, The Washington Post reported on the conflicting geomagnetic forecasts from the leading federal agencies that monitor solar storms. Today's solar storm events match those cited in the featured February 2012 BPI essay, indicating early warning of a destructive CME.

NOAA /   Prepared jointly by the U.S. Dept. of Commerce, NOAA,
 Space Weather Prediction Center and the U.S. Air Force. 3-day Solar-Geophysical Forecast issued Jul 08 22:00 UTC   http://www.swpc.noaa.gov/today.html

Solar Activity Forecast: Solar activity is expected to be moderate with a chance for X-class events for the next three days (09-11 July).

Geophysical Activity Forecast: The geomagnetic field is expected to be mostly quiet on day one (09 July). Quiet to unsettled conditions are expected on day two (10 July), with a chance for isolated active periods due to possible weak effects from the CME observed on 06 July. A return to mostly quiet conditions is expected for day three (11 July).

This year has seen a steady influx of news reports on increased solar storm activity hitting Earth. Most broadcasts concerning this development are of a less serious kind, featuring its spectacular visual effects, which create the unworldly "Northern Lights" or "Aurora Borealis." However, a few reports have mentioned necessary cancellations of airline flights using trans-polar routes due to the sun's disruptive solar flares. Intense solar activity is nothing new but a recurring event, one which has taken place countless times before civilization ever existed on Earth. What's of concern today is the 11-year peak cycle the sun is now entering, resulting in extreme solar storm activity. Some solar physicists predict the current cycle of storms may have greater magnitude than any before, including the record solar maximum chronicled over 150 years ago, in 1859.

Why should anyone care if solar storm activity becomes more intense than at any other time in recorded history? Simply stated: civilization as we know it could be stopped in its tracks, or altered to resemble something unrecognizable.

Imagine not being able to turn on the lights in your home or office; communication by phone, email and social media all gone, with no guarantee as to when it could or would be back online. There are other, more challenging issues regarding basic food production and distribution. The cited scenarios are extreme, but they are possible consequences of a major solar storm. The most intense solar disruptions, known as coronal mass ejections (CMEs), could knock out virtually any technology requiring electricity. Such an event could take away most of the technology we depend on and, ironically, transport our way of life back to the time when the last great CME hit.

If you had a window peering back in time to the end of August 1859, you'd see a developing Western society with an industrial revolution in full motion. Harnessing the new wonders of steam energy was nearly complete; electrical energy, however, had barely reached its first phase of infancy. Few applications for electricity existed, except for a remarkable one in the form of instant communication. By sending electrical pulses through copper wires to a remote electromagnetic receiver, messages were transmitted instantly over great distances. The telegraph could be considered the 19th-century equivalent of today's Internet. The system used a basic, universal binary code developed primarily by the American artist Samuel F. B. Morse. By the mid-19th century, scientists had demystified electricity's secrets, and inventors found ways to harness it for communication using "direct current."

As the summer heat of September approached the northern hemisphere, a series of solar storms increased with startling intensity, producing extreme Northern Lights that appeared in unlikely places, such as the Caribbean near the equator. Inhabitants of northeastern America reported using the intense Northern Lights to read newspapers during the dark hours of night. Other stories mention groups of people awakened by the strange, bright light and believing it was actually morning. All over the world, compasses used for navigation (the rough equivalent of today's GPS) no longer gave accurate readings, as Earth's geomagnetic field was being distorted by the solar storms' energy.

Sunspots were first documented by Galileo in the 17th century; these solar disturbances contribute to solar storms.

Sunspots on the sun's surface contribute to forming solar storms; Galileo first observed them in the 17th century, and by 1745 solar flares were well documented. Up until 1859, the solar storms' only known effect on humans was producing dazzling displays of cosmic fireworks far into the northern and southern latitudes.

The uninformed industrial-age public had no reason for concern as the peak of the solar storm began arriving on September 1st and 2nd. These extreme, violent solar flares hurled enormous magnetic clouds of plasma into space, known as a coronal mass ejection (CME). This CME solar storm became known as the Carrington Event, named for the British astronomer who first recognized and identified its geomagnetic effects on Earth.

Solar ejections normally take three to four days to reach Earth, but this extreme burst had a hyper-velocity: its shock waves took less than 18 hours to compress Earth's protective magnetic field.
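The quoted transit times imply an enormous difference in speed. A quick back-of-envelope sketch (illustrative only, using the mean Earth-Sun distance and the commonly cited sub-18-hour figure) shows how much faster the Carrington ejection traveled than an ordinary CME:

```python
# Back-of-envelope average Sun-to-Earth speed for a CME.
# Illustrative numbers; 17.6 hours is a commonly cited estimate
# for the 1859 Carrington event's transit time.
AU_KM = 149_597_871  # mean Earth-Sun distance in kilometers

def cme_speed_km_s(transit_hours):
    """Average speed over 1 AU for a given transit time."""
    return AU_KM / (transit_hours * 3600)

carrington = cme_speed_km_s(17.6)      # 1859 event: under 18 hours
typical = cme_speed_km_s(3.5 * 24)     # ordinary CME: 3-4 days

print(f"Carrington CME: ~{carrington:,.0f} km/s")  # roughly 2,400 km/s
print(f"Typical CME:    ~{typical:,.0f} km/s")     # roughly 500 km/s
```

Roughly a factor of five in average speed, which is why so little warning time would be available for a Carrington-class storm.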

As a surge of solar electromagnetic energy overpowered and broke through part of Earth's protective magnetic field, alarming events began happening. First came a series of random, garbled telegraph signals which, mysteriously, had not been sent by any operator; then reports of telegraph receivers violently bursting into flames, setting secondary fires to office papers and to the telegraph lines themselves. Jolts of electricity nearly electrocuted some operators attempting to disconnect the system's electrical batteries; even with the batteries disconnected, frenetic signals continued out of control from massive energy overflows, as the geomagnetic superstorm sent dangerous charges of electricity through a vast network of copper lines. The geomagnetic storm caused by the sun devastated an emerging communication infrastructure and severely set back its development.

This record solar storm appeared on the scene well before societies and industries realized electricity's great potential, unlike today, when electricity is an essential necessity in just about every technology we use and take for granted.

Until recently, I'd always looked forward to the Northern Lights' dazzling arrival. I recall my first Aurora Borealis encounter shortly after graduating from college, on a road trip to the Olympic rain forest. Camping out in the Olympic Mountains, I watched the northern sky begin glowing at twilight, with vivid illuminated curtains moving until they were flashing directly overhead. I kept watching the surreal specters until they exited from view an hour later.

The next time I viewed these mysterious lights was on a photography assignment to the "North Slope" oil fields, located above Alaska's Arctic Circle. Earth's natural magnetic field, which protects the planet from much of the sun's radiation, is weakest near the polar regions, allowing solar winds to enter and interact with our atmosphere to create the aurora. This is why the cosmic lights are viewed looking north in the northern hemisphere, and the reverse in the southern hemisphere.

Captivated by the up-close experience of the aurora's light, I endured the extreme outside temperature of minus 40 degrees. Facing frigid arctic weather, I photographed the light show until the springs controlling my camera's shutter began to freeze up.

Today's digital cameras actually make it easier to photograph the Northern Lights. Digital cameras, especially high-end professional versions, are much more sensitive in low light than film cameras were and have a better tonal dynamic range. My all-time favorite Northern Lights experience was in Eastern Washington, at a ranch in the Okanogan region. That encounter was so full of effervescent bright light it woke birds from their night's sleep; they began to take flight, making loud chirping sounds as if dawn had arrived. The environment, with no light pollution from a city and a 5,000-foot elevation, made for an ideal night-sky photography experience.

The year 2003 brought one of the greatest solar flare events in contemporary history; the Northern Lights were so intense that I easily photographed them from my home in Western Washington. Even the bright lights of a nearby city did not obscure the luminous Aurora Borealis. The Northern Lights photos featured in this essay were taken from my home. In these images you can see the glowing, transient green, red and purple colors produced as the sun's energy interacts with the various gases that comprise Earth's atmosphere.

The reason solar flare events will peak in 2013, or possibly early 2014, is that the sun's magnetic field reverses polarity on an 11-year cycle. It takes a full 22 years for the sun's magnetic fields to return to their original pole positions, completing a full cycle. Near the peak of the 11-year cycle, which our sun has now entered, solar flare activity becomes more intense.

The record solar maximum of 1859 fell on one of these 11-year cycles. Another theory connected with returning mammoth CMEs is the high quantity of sunspots recorded over the past couple of decades. Sunspots appear when portions of our star's internal superheated matter mix with cooler regions above the surface, creating intense magnetic fields. These magnetic fields are swept up and then forced below the surface, where they are recycled by the sun's complex internal dynamics. Energy from sunspots becomes amplified, creating even more extreme magnetic fields as they resurface from a four- or five-year subsurface journey. These magnetic disturbances interact to create concentrated arcs of solar energy so powerful they are ejected outward in the form of solar flares.

Another method scientists use for estimating the potential scale of this year's solar storms is to examine recent solar cycles, looking for progressive trends or patterns to base their projections on.

In 1989 a CME hit Earth with intense energetic particles, causing the electrical grid in Quebec, Canada, to crash and plunging millions of people into darkness. This event took place during the Cold War, and it caused severe shortwave radio disruptions, with Aurora Borealis sightings as far south as Texas. Some believed the disruption was the beginning of a Soviet nuclear first strike, using intense electromagnetic energy to disrupt communications and electric grid infrastructure. In reality, the blackout was caused by a CME created by the sun's own nuclear energy. Acting like a giant teetering domino, the event triggered a chain reaction, taking down interconnected electric networks within a large region of North America. Even this event, though, was not on the scale of the mega storm of 1859. That's why some scientists view the decades-old Hydro-Quebec solar storm as a telegraphed alarm warning.

With demand for power growing even faster than the grids themselves, modern networks are sprawling, interconnected, and stressed to the limit, a recipe for trouble, according to the National Academy of Sciences: "The scale and speed of problems that could occur on [these modern grids] have the potential to impact the power system in ways not previously experienced." There's fear that the expanded network of lines creates a bigger antenna, enabling it to channel geomagnetically induced currents (GICs). NASA has become so alarmed at how much more vulnerable the North American power grid has become that it co-developed an experimental program called "Solar Shield" to help warn utilities of impending geomagnetic storms.
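The "bigger antenna" concern can be made concrete with a rough, quasi-DC estimate: a storm-time geoelectric field drives an EMF along the length of a line, and Ohm's law gives the induced current. The field strength, line length, and loop resistance below are illustrative assumptions, not measurements of any real line:

```python
# Rough, illustrative estimate of geomagnetically induced current (GIC)
# in a long transmission line. Quasi-DC approximation: the storm's
# geoelectric field E (volts per km) integrated over the line length
# gives an EMF, and Ohm's law gives the current through the loop.
def gic_amps(e_field_v_per_km, line_km, loop_resistance_ohms):
    """Induced EMF = E * L; current = EMF / R (all values assumed)."""
    emf_volts = e_field_v_per_km * line_km
    return emf_volts / loop_resistance_ohms

# A 1989-Quebec-class field (~1 V/km, a commonly cited order of
# magnitude) over an assumed 200 km line with ~3 ohms of total loop
# resistance (line + transformer windings + grounding):
print(f"{gic_amps(1.0, 200, 3.0):.0f} A")  # tens of amps of quasi-DC
# A stronger, Carrington-class field estimate (~5 V/km, same line):
print(f"{gic_amps(5.0, 200, 3.0):.0f} A")
```

Even a few amps of quasi-DC flowing through a transformer designed for pure AC can push its core toward saturation, which is why currents on this order, and the longer lines that collect them, are treated as a serious threat.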

Since 1989 we have become much more dependent on microelectronics, with their intricate architecture of high-density, compressed components. Having unshielded microcircuits squeezed tightly together increases the odds of severe damage from geomagnetically induced currents (GICs). The 1989 solar storm damaged at least 30 satellites, some of them beyond repair. Solar storms can easily scramble the intricate digital components of low-orbit satellites and disorient them from knowing which way is up or down.

In theory, with enough warning, orbiting satellites can be safely switched off or pointed away from the sun's destructive radiation. Early-warning satellites are now positioned at the L1 Lagrange point to monitor solar storms and announce threatening CME activity. The Solar Shield Project is a collaboration between NASA's Goddard Space Flight Center and the Electric Power Research Institute (EPRI). Its purpose is to establish a forecasting system that can be used to lessen the impact of geomagnetically induced currents (GICs) on high-voltage power transmission systems. (Please see the associated link below for more information.)

Earth's atmosphere and magnetic field normally protect us from harmful solar storm radiation. Exposure to the sun's powerful energy becomes a greater factor as you climb in elevation. Radiation exposure is a secondary reason why airlines must divert from their trans-polar routes during storms.

Disruption of GPS and radio communication by solar storms is the primary reason for flight diversions. Astronauts working above Earth's protective atmosphere face the greatest risk from solar flares. These stellar storms have shortened or altered a number of space missions in the past. In 1993 the Russian space station Mir had an unfortunate encounter with a solar storm, exposing the cosmonauts to dangerous radiation levels, over 10 times the normal allowable limits.

What could be the warning signs or likely indicators of an impending maximum solar disturbance? So far, NASA and NOAA are the only government agencies I'm aware of that keep the public informed with the current status of solar flares.

At the end of this essay are links which give important information on this year's solar storms, including NASA and NOAA sites that monitor hourly conditions. If solar storm activity becomes alarming, NASA will most likely be out front with the reports, and the major news networks will probably soon follow. If a certain threshold of X-rays is reached within the first phase of a major solar storm, the FAA will order cancellation of trans-polar airline flights. Disruption of shortwave radio communication is the earliest indicator of a severe storm. If conditions become dire, all but emergency flights would be grounded indefinitely.

 - Image courtesy of NASA


If NASA issued orders to evacuate astronauts from the International Space Station (ISS), it would probably be a strong indicator that radiation levels from the second phase of a storm were severe. Supposedly the center of the space station has enough mass to offer some protection from this type of event, but NASA would probably play it safe and order emergency return flights, that is, if there was enough time. Seeing the Northern Lights close to the equator would be a strong indicator that Earth's geomagnetic field was being overrun, meaning the big one might be arriving. If a major CME (the particle phase of a storm) comes our way, there may be 18 hours or less to prepare. On the positive side, unlike a major earthquake or other natural disaster, we would at least have some time to prepare and brace for a worst-case scenario.

.

It would be an unfortunate irony if the sun made our world go dark, but here's how it could happen. The National Academy of Sciences produced a 2008 report warning that if we had another major solar storm like the 1859 Carrington Event, we would have extensive blackouts with the loss of key transformers. Our nation's electrical utilities have fewer than 400 major transformers in total to supply all the power we use. No companies within the U.S. still make these massive transformers. If an extreme solar maximum arrives, we'll probably be on a long waiting list (along with the rest of the world) for key replacements. Given enough time, these massive electrical components can be built domestically, but it could take years, a major obstacle and a catch-22: transformers require huge amounts of electricity for their construction.

Even without a disaster, electric utilities face a minimum of two years between when a major transformer (average cost: 4 million dollars) is ordered and when it is finally installed, according to a global equipment insurance company. Critical shortages of raw materials and of a trained workforce for transformer installation contribute to the problem. Hopefully the utility company supplying your community's power learned a lesson from the 1989 Hydro-Quebec blackout. There are preventive strategies to guard against geomagnetically induced current (GIC), such as a "solid ground system," an industry design that helps protect electrical infrastructure from a nuclear-induced electromagnetic pulse (EMP).

An EMP creates a tremendous amount of electromagnetic energy, similar in some ways to a naturally occurring solar CME. The next-best plan for electric utilities is to disconnect the power lines from any plant's key equipment threatened by massive surges of electromagnetic energy. Just disconnecting lines could prove ineffective if a surge were big enough: the connecting leads to a transformer could act as an antenna, attracting the surge of electromagnetic energy.

There is something you can do to protect your own electrical devices from the devastating effects of either a solar CME or a nuclear EMP. You can easily, at very little cost, build what is known as a Faraday cage to protect your equipment. For instance, for a radio, cell phone or batteries (all of which are vulnerable to massive electrical surges), first wrap the devices in thick plastic, such as a freezer bag or bubble wrap, then use three layers of aluminum foil to completely wrap the devices so there are no gaps. The plastic acts as an insulator from the metal foil, which in turn deflects the energy.
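Why do a few layers of thin foil help at all? A conductor attenuates an incoming field over a characteristic "skin depth" that shrinks as frequency rises. The sketch below uses textbook constants but illustrative frequencies and an assumed foil thickness; it suggests the foil wrap is most effective against fast transients, such as an EMP's leading spike, rather than slowly varying geomagnetic fields:

```python
import math

# Rough skin-depth estimate for aluminum foil. Skin depth is the
# distance over which a field inside a conductor falls by a factor
# of e: delta = sqrt(2 * rho / (mu0 * 2 * pi * f)).
RHO_AL = 2.65e-8          # aluminum resistivity, ohm*m (textbook value)
MU0 = 4e-7 * math.pi      # vacuum permeability, H/m

def skin_depth_m(freq_hz):
    """Skin depth in meters for aluminum at a given frequency."""
    return math.sqrt(2 * RHO_AL / (MU0 * 2 * math.pi * freq_hz))

for f in (1e6, 100e6):    # 1 MHz and 100 MHz, illustrative frequencies
    print(f"{f:12.0f} Hz: skin depth ~{skin_depth_m(f) * 1e6:.0f} microns")

# Household foil is assumed here to be ~16 microns thick, so three
# layers (~48 microns) give useful attenuation at tens of MHz and up,
# but very little against slowly varying geomagnetic fields.
```

This is also consistent with the advice elsewhere in the essay: for the slow, quasi-DC part of a geomagnetic storm, the real danger is to long conductors like power lines and pipelines, not to small wrapped devices.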

I’ve include a web link to an electrical engineer’s website who explains the procedures and others for protecting against Solar CMEs or EMPs. You can also do a google search for Faraday cage.  Unplugging your electrical equipment from outlets is a good safety precaution, which ordinarily could protect you against a lighting storms, but will probably not prevent your electronics from being fried from a major CME.  If you remembered what happen to the telegraph system, which was hit by the largest CME in history in 1859, the electromagnetic energy used the unconnected wires from the telegraph as an antenna to channel its force through. Tesla, the great Hungarian born inventor who championed AC electrical power, proved electrical transmission could efficiently be sent through air without using power lines.

One other critical infrastructure that could be devastated by a CME or EMP is major pipelines. The metal in power lines and pipelines is a great conductor for geomagnetic energy. Testing has shown electromagnetic surges can affect the controls that monitor pressure and flow in buried high-pressure pipelines. In Russia, past solar storms were found to have caused severe corrosion on some of its pipelines. Apparently the corrosion effect is less of an issue in North America, because the pipes are manufactured using a more advanced process.

For most civil preparedness involving impending emergencies, it's best to listen to the experts, who advise always having enough food, water and flashlights on hand to survive the aftermath of a major natural disaster. A good plan for keeping in contact with family members will be critical if a major solar storm occurs, especially an extreme maximum CME, as communication equipment will be toast unless it was properly shielded. Self-reliance is a good policy to help sustain individuals and families through the effects of a major solar storm or other catastrophe. Most of the common-sense preparations mentioned in this essay are basic ones every family should have in place in case of an earthquake or any major disaster.

Will a devastating solar storm hit in 2013 or 2014? No one can forecast for certain how severe this solar maximum will or will not be; however, if there's enough strength behind a solar storm and its path is aimed directly at Earth, it could be the greatest challenge civilization has ever faced. Learning the lessons of history has been an essential part of the human experience; we thrive in the moment by learning from past events. This seems so obvious for self-preservation, but it involves a fine-tuned balancing process between what we carefully choose to forget of painful tragedies and what we remember of our own inspirational triumphs. Ideally, any type of learning produces confidence and preparedness for future encounters, situations and events.

Given a solar CME's disruptive potential, it's in everyone's self-interest to judge the potential risk, then have an action plan to help lessen the life-altering impact of an extreme act of nature. Personally, I don't sense any impending doom with this year's solar maximum. By doing basic research to become educated on solar events, I gained knowledge of their potential to disrupt our infrastructure. With informed awareness, I'm confident I've taken the necessary precautions for my family to be ready for this and any future natural disaster which may arrive from over the horizon. ~

The Aurora Borealis, or Northern Lights, have been revered and feared by ancient and prehistoric cultures. The phenomena are created by solar winds colliding and interacting with Earth's atmosphere.

Below are useful links related to solar storms, including official government agencies such as NASA and NOAA, as well as sites and articles from National Geographic, The Washington Post and The Christian Science Monitor.

You’re encouraged to click on the links below to learn more about solar storms. ↓

A most beautiful video time-lapse of the Aurora Borealis  http://vimeo.com/11407018

http://www.swpc.noaa.gov/

http://science.nasa.gov/science-news/science-at-nasa/2003/23oct_superstorm/

The Solar Shield Project is a collaborative project between NASA Goddard Space Flight Center and the Electric Power Research Institute (EPRI).  http://ccmc.gsfc.nasa.gov

An electrical engineer, who gives great information on how to protect your electrical components from an EMP blast, produces this site. He also offers an expert opinion on what to expect will happen to our nation's electrical grid if such an event occurs. http://www.futurescience.com/emp/emp-protection.html

http://news.nationalgeographic.com/news/2011/03/110302-solar-flares-sun-storms-earth-danger-carrington-event-science/

http://www.csmonitor.com/Science/Cool-Astronomy/2010/0809/Could-a-solar-storm-send-us-back-to-the-Stone-Age

http://www.flixxy.com/solar-storm-1859.html

http://news.nationalgeographic.com/2012/03/120308-solar-flare-storm-sun-space-weather-science-aurora/

