Wednesday, December 17, 2014

Homeland Security and Public Safety : FBI Beefs Up Amid Explosion of Cybercrime

The head of the FBI said cybercrime is "exploding" apace as the influence of the Internet rises meteorically.

FBI Director James Comey
FBI Director James Comey takes questions from members of the media during a news conference on Nov. 18, 2014, in Boston. (AP Photo/Steven Senne)

(TNS) — The head of the FBI said cybercrime is "exploding" apace as the influence of the Internet rises meteorically.

"It (the Internet) is transforming human relationships in ways we've never seen in human history before," FBI Director James Comey said Friday.

Comey said he sees a "tremendous amount of cyberespionage going on — the Chinese being prominent among them, looking to steal our intellectual property."

"I see a whole lot of hacktivists, I see a whole lot of international criminal gangs, very sophisticated thieves," he said. "I see people hurting kids, tons of pedophiles, an explosion of child pornography."

Cybercrime is one of the priorities for the FBI, which has 13,260 special agents across the country, including on Oahu, Maui and Hawaii island, according to the agency. The FBI had an $8.3 billion budget in fiscal 2014.

Comey, the head of the FBI for 15 months, met with law enforcement officials on Oahu on Friday as part of his effort to visit each of the FBI's 56 field offices around the country by the end of the year.

The Honolulu office in Kapolei was No. 56, he said.

Comey met with U.S. Attorney Florence Nakakuni and county police chiefs and introduced Paul Delacourt as the new special agent in charge of the FBI's Honolulu Division.

Comey was asked at the press event about former Oahu resident and NSA whistleblower Edward Snowden and other intelligence breaches in Hawaii in recent years.

He said that foreign intelligence-gathering in Hawaii is "significant."

"Let me say it this way: Our counterintelligence program is one of the most important parts of the FBI and our partnerships here," Comey said. "This (Hawaii) is where a huge part of the nation's military, civilian and intelligence infrastructure is, so it's where foreign nation-states are going to come if they want to steal stuff from us."

Comey said he wouldn't characterize the threat of foreign intelligence-gathering as increasing, but "it is significant and it has remained significant."

As for Snowden, who lives in exile in Russia, Comey said he would "welcome the opportunity to afford him the rights and privileges attendant to anybody who is a defendant" in the U.S. criminal justice system.

The Kapolei field office has about 200 people, including agents and analysts, and those working in computer services and administration, the FBI said.

The agency said it does not reveal how many agents there are within that total.

Comey said he talked with the agents in Hawaii and his law enforcement partners "about ways to work better together and to see if we can't get more technology and more talent here to focus on the cyber threat."

Agency-wide, the FBI is "doing a lot of hiring" to combat cybercrime, Comey said.

"We've hired 100 more computer scientists," he said. "I'm investing in high-speed accesses and all kinds of equipment."

He added, "Congress has given us the resources, because Congress sees the threat."

In an October talk at the Brookings Institution, Comey raised concerns that real-time and stored data, including phone calls, live chat sessions, and email and text messages, are increasingly being encrypted.

"We call it 'going dark,' and what it means is this: Those charged with protecting our people aren't always able to access the evidence that we need to prosecute crime and prevent terrorism even with lawful authority," Comey said at the time.

The discussion comes as Snowden's government-eavesdropping revelations have created a backlash against such far-reaching surveillance efforts.

Comey said Friday it's important for the nation to "have a conversation" about giving law enforcement the ability to access real-time and stored data.

"I'm a big fan of privacy," he said. "I don't want anybody rifling through my stuff. But if there's probable cause to believe that the evidence of a serious crime is contained on a device, we need to be able to get access to it."

©2014 The Honolulu Star-Advertiser. Distributed by Tribune Content Agency, LLC.
 www.emergencymgmt.com 

Disaster Preparedness & Recovery : Hurricane Forecasters Will Issue Storm Surge Graphics in 2015

The graphic will be experimental for at least two years, while the government gets real-time experience and feedback from the media and public.

Storm surge graphic
An example of the graphic that the National Hurricane Center will begin issuing for storm surges in 2015.(National Hurricane Center)
(TNS) — As promised, starting next season, hurricane forecasters will begin issuing watch and warning graphics not just for hurricanes and their potential paths, but also for storm surge, by far the biggest killer, more so than rain-driven flooding or high winds.

Surges often can strike well before or after landfall, and sometimes far from the point of impact, even outside the storm’s wind field.

“While most coastal residents can remain in their homes and be safe from a tropical cyclone’s winds, evacuations are generally needed to keep people safe from storm surge. Having separate warnings for these two hazards should provide emergency managers, the media, and the general public better guidance on the hazards they face when tropical cyclones threaten,” the National Oceanic and Atmospheric Administration said Thursday in a press release.

The graphic will be experimental for at least two years, while the government gets real-time experience and feedback from the media and public.

Last year, the National Hurricane Center began using an experimental potential storm surge flooding map. It plans to combine the usual storm watches and warnings with storm surge advisories starting in 2016.


©2014 The Palm Beach Post (West Palm Beach, Fla.). Distributed by Tribune Content Agency, LLC.
 www.emergencymgmt.com 

Disaster Preparedness & Recovery : White House Launches Open Data Disaster Portal

More than 100 data sets can be found at disasters.data.gov along with tools, contests and resources to help first responders and technologists engage with a larger community.

Disasters.data.gov
Disasters.data.gov was designed to foster collaboration and the continual improvement of disaster-related open data and tools.
The White House launched a new open data portal on Dec. 15, targeting the needs of first responders and emergency survivors. The website, found at disasters.data.gov, features disaster-related data sets, tools and resources for those who want to join a larger community of data-minded first responders.

The website features data (more than 100 sets) that can be sorted by type of disaster — earthquakes, floods, hurricanes, severe winter weather, tornadoes and wildfires — as well as tools and apps, and information about how to get involved or join challenges like the current Innovator Challenge, which calls for ideas on how to reduce flooding fatalities.

The Innovator Challenge listed as of the date of this report is the first in a series intended to highlight the needs of the disaster preparedness community.

The website was born from the White House Innovation for Disaster Response and Recovery Initiative, a project launched in response to Hurricane Sandy that aims to turn technology into empowering tools that can save lives. The initiative has manifested in the form of a hardware hackathon for disaster preparedness and partnerships with both the private and public sectors, but this portal represents the first unified online resource for such efforts.
www.emergencymgmt.com 

Monday, December 15, 2014

Border Security : Drones watch over U.S. borders

Published 17 November 2014
Since 2000, the number of Border Patrol agents on the 1,954-mile U.S.-Mexico border has more than doubled, to surpass 18,000, and fencing has increased nine times — to 700 miles. Some members of Congress and border state lawmakers are calling for more border agents and more fencing, but the Obama administration is looking to drones to help reduce the number of illegal immigrants and drugs entering the United States, while simultaneously shifting resources and agents to parts of the border where illegal activity is highest.
CBP drone waiting on the tarmac // Source: eanlibya.com
Since 2000, the number of Border Patrol agents on the 1,954-mile U.S.-Mexico border has more than doubled, to surpass 18,000, and fencing has increased nine times — to 700 miles. Some members of Congress and border state lawmakers are calling for more border agents and more fencing, but the Obama administration is looking to drones to help reduce the number of illegal immigrants and drugs entering the United States, while simultaneously shifting resources and agents to parts of the border where illegal activity is highest.
“You have finite resources,” said Customs and Border Protection (CBP) Commissioner R. Gil Kerlikowske. “If you can look at some very rugged terrain (and) you can see there’s not traffic, whether it’s tire tracks or clothing being abandoned or anything else, you want to deploy your resources to where you have a greater risk, a greater threat.”
The Christian Science Monitor reports that since March 2013, the U.S. government has operated about 10,000 drone flights, covering 900 miles along the border, mostly over remote mountains, canyons, and rivers. The drones fly out of the U.S. Army Intelligence Center at Fort Huachuca in Sierra Vista, Ariz., or out of Corpus Christi, Texas. They operate at altitudes of 19,000 to 28,000 feet, and within twenty-five to sixty miles of the border.
Nearly half the U.S.-Mexico border is now patrolled by Predator B drones with high-resolution video cameras that send footage to analysts who then identify small changes in the landscape — the tracks of a farmer or livestock, or those of illegal immigrants or vehicles used to smuggle drugs into the country. Ninety-two percent of drone missions have shown no change in terrain, but 8 percent have raised enough concerns to dispatch agents to review the changes. Four percent of the reviews have been false alarms, likely the tracks of farmers or cows, and 2 percent are inconclusive. The remaining 2 percent show evidence of illegal crossings from Mexico, which then leads to closer monitoring via ground sensors.
Representative Michael McCaul (R-Texas), who chairs the House Homeland Security Committee, believes the drone approach is proactive at a time when “we can no longer focus only on static defenses such as fences and fixed (camera) towers.” Senator Bob Corker (R-Tennessee), who co-authored 2013 legislation to add 20,000 Border Patrol agents and 350 miles of fencing to the southwest border, said, “If there are better ways of ensuring the border is secure, I am certainly open to considering those options.”
The drone program is expected to expand to the Canadian border by the end of 2015.
http://www.homelandsecuritynewswire.com/

Detection : The science of airport bomb detection: chromatography

By Martin Boland
Published 12 December 2014
As the holidays draw near, many of us will hop on a plane to visit friends and family — or just get away from it all. Some will be subjected to a swab at the airport to test clothes and baggage for explosives. So how does this process work? The answer is chromatography — a branch of separation chemistry — along with mass spectrometry. Although instrumental chromatography is a mature technology (the first instruments were produced just after WWII), new applications frequently pop up. Some are a matter of scale. Pharmaceutical companies that produce monoclonal antibodies (often used in cancer treatments) make use of capture chromatography to purify their products. On an industrial scale these columns can be tens of centimeters in diameter and meters in length (typical lab-scale systems are a few millimeters in diameter and 5-30cm long). Other uses arise in specific new applications, such as detecting cocaine on bank notes using the gas chromatography systems often seen at airports as bomb and drug detectors.
As the holidays draw near, many of us will hop on a plane to visit friends and family — or just get away from it all. Some will be subjected to a swab at the airport to test clothes and baggage for explosives. So how does this process work?
The answer is chromatography — a branch of separation chemistry — along with mass spectrometry (which I will address in a later article).
The word “chromatography” is roughly translated from Greek as “the science of colors.” The reason for the name becomes obvious when you realize that most people have accidentally performed a simple chromatography experiment.
If you’ve ever spilled water onto a hand-written shopping list, then held it up to let the water run off, you’ve probably noticed the ink diffuses across the paper, and that the pen’s color is made up of several pigments (if you haven’t, you can do the experiment — try it with a couple of pens of different brands, but the same color). This separation is chromatography.
There are several different types of chromatographic separation. What they all have in common is that a mixture of materials that need to be separated (the analytes) is washed over a solid material (called the matrix), causing the analytes to separate.
That may sound like chromatography is just filtration, or separation by particle size. In some cases, that is almost exactly what happens (size exclusion chromatography is often referred to as gel filtration chromatography).
But most chromatography methods work by chemical effects other than simply the size of the materials being separated, including (but not limited to):
  • normal-phase chromatography, such as ink on paper
  • reverse-phase chromatography, often used in university lab experiments
  • gas chromatography, seen in airport bomb detectors
  • “capture” chromatography, used to purify drugs.
Each of these can be performed with a single solvent, such as dropping water on your shopping list, known as isocratic (Greek for "equal power"), or with a changing mixture of solvents (known as a gradient).
So how does it work?
Technically speaking, it is the differential affinity of the analyte for the solvent and the solid matrix that drives chromatographic separation. So what does that mean, really?
You’ll need to bear with me here.
Have you ever been shopping with someone who stops to look at things while you’re trying to move through the store as quickly as possible?
That differential attraction to the stuff surrounding you — that’s what drives chromatography. You walk through the aisles only rarely interacting with the goods on sale, while your shopping partner has much greater affinity for the shelves and stops frequently. By the time you’re at the exit they are still only halfway through the shop — you’ve separated!
That is what happens to molecules. The solvent flows over the matrix (in the shopping list case, the paper) carrying the analytes. The relative affinity of the analyte for the matrix compared with the solvent determines the separation.
If a compound is totally insoluble in the solvent, it stays fixed to the matrix (you may have seen this when spilling water on a shopping list written in pencil). If the analyte is very soluble, it may move as fast as the solvent.
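The shopping analogy maps directly onto a toy calculation. A minimal sketch, assuming (purely for illustration) that each analyte moves at the solvent's speed only during the fraction of time it spends dissolved rather than stuck to the matrix:

```python
# Toy model of differential affinity: an analyte advances only while
# it is dissolved in the moving solvent, so its average speed is the
# solvent speed scaled by the fraction of time it spends mobile.
# All numbers are invented for illustration.

def positions(mobile_fractions, solvent_speed=1.0, steps=100):
    """Distance each analyte has traveled after `steps` time units."""
    return [solvent_speed * steps * f for f in mobile_fractions]

# Analyte A barely sticks to the matrix (mobile 90% of the time);
# analyte B binds strongly (mobile only 20% of the time).
a_pos, b_pos = positions([0.9, 0.2])
print(a_pos, b_pos)  # 90.0 20.0 -- the mixture has separated
```

By the time the fast analyte reaches the end of the column (the shop exit), the strongly bound one is far behind: exactly the separation the shopping story describes.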
The shopping list example is called planar chromatography. The running ink seems to defy gravity, moving up the paper due to the capillary effect. More common is high-performance chromatography, in which the matrix is packed into a column and the solvent is forced through it by gravity or pumping.
Using a column makes it easier to change the ratio of solvents by using a pump that can mix multiple solvents (usually water and a water-miscible organic solvent such as acetonitrile).
In the case of a gradient separation, the analyte has much higher affinity for the matrix than for the initial solvent mixture. As the solvent mix is changed, the analyte dissolves in the solvent and is carried out of the column separated from materials that are soluble in different solvent ratios.
Sometimes it’s a gas, gas, gas
For gas chromatography, the set-up is a little different. The analytes are gases or volatile liquids (think petrochemicals, plant oils, chemical weapons). Such compounds are usually non-polar and hydrophobic – in other words, they don’t mix well with water.
The compounds are evaporated into an inert carrier gas (analogous to dissolving in a solvent). The carrier gas transports the compound over a hydrophobic matrix contained in a coiled column (often tens of meters long but only micrometers wide).
To improve separation, and to allow analysis of materials with higher boiling points (up to around 300°C), the column is placed in an oven. Changing the temperature of the oven affects separation in a similar way to changing the mixture of solvents in liquid chromatography.
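The temperature effect can be sketched with a toy retention model. The constants below are invented; the qualitative point is simply that the retention factor (how long a compound lingers on the column relative to the carrier gas) falls steeply as the oven gets hotter:

```python
import math

# Toy model: retention factor k shrinks roughly exponentially as the
# oven temperature T (in kelvin) rises. A and B are invented constants;
# real values depend on the analyte and the column coating.
def retention_factor(T, A=1e-3, B=3000.0):
    return A * math.exp(B / T)

# Retention time = dead time * (1 + k), assuming a 1-minute dead time.
def retention_time(T, dead_time=1.0):
    return dead_time * (1 + retention_factor(T))

cool, hot = retention_time(350.0), retention_time(550.0)
print(cool > hot)  # True: a hotter oven elutes the compound sooner
```

Temperature programming (ramping the oven during a run) exploits this: heavy, sticky compounds that would otherwise linger for a very long time are pushed out in minutes.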
Quality control
When separating colored compounds it’s pretty obvious when the process has worked. But how do you know if you’ve separated two colorless compounds, or separated microscopic amounts of analyte?
There are several ways to detect the analytes depending on their chemical and/or physical properties. Among the more common are:
  • ultraviolet or infrared (non-visible but optical wavelength) absorbance
  • non-visible fluorescence
  • conductivity or pH (how acidic the solution is)
  • sample collection followed by chemical tests
  • mass spectrometry.
Probably the most useful of these is mass spectrometry as it allows the analyst to work out exactly what compound they are seeing without needing prior knowledge of what was in the original analyte mixture.
An ever-developing world
Although instrumental chromatography is a mature technology (the first instruments were produced just after WWII), new applications frequently pop up.
Some are a matter of scale. Pharmaceutical companies that produce monoclonal antibodies (often used in cancer treatments) make use of capture chromatography to purify their products. On an industrial scale these columns can be tens of centimeters in diameter and meters in length (typical lab-scale systems are a few millimeters in diameter and 5-30cm long).
Other uses arise in specific new applications, such as detecting cocaine on bank notes using the gas chromatography systems often seen at airports as bomb and drug detectors.
And even more exciting experiments are being carried out by chromatography instruments aboard the Philae probe, which detected organic chemicals on comet 67P/Churyumov–Gerasimenko.
Martin Boland is Senior Lecturer of Medicinal and Pharmaceutical Chemistry at Charles Darwin University. This story is published courtesy of The Conversation (under Creative Commons-Attribution/No derivatives).
www.homelandsecuritynewswire.com 

Cybersecurity : Can a hacker stop your car or your heart? Security and the Internet of Things

Published 15 December 2014
By Temitope Oluwafemi 
An ever-increasing number of our consumer electronics are Internet-connected. We’re living at the dawn of the age of the Internet of Things. Appliances ranging from light switches and door locks, to cars and medical devices boast connectivity in addition to basic functionality. The convenience can’t be beat, but the security and privacy implications cannot and should not be ignored. There needs to be a concerted effort to improve security of future devices. Researchers, manufacturers and end users need to be aware that privacy, health and safety can be compromised by increased connectivity. Benefits in convenience must be balanced with security and privacy costs as the Internet of Things continues to infiltrate our personal spaces.
An ever-increasing number of our consumer electronics are Internet-connected. We’re living at the dawn of the age of the Internet of Things. Appliances ranging from light switches and door locks, to cars and medical devices boast connectivity in addition to basic functionality.
The convenience can’t be beat. But what are the security and privacy implications? Is a patient implanted with a remotely controllable pacemaker at risk for security compromise? Vice President Dick Cheney’s doctors worried enough about an assassination attempt via implant that they disabled his defibrillator’s wireless capability. Should we expect capital crimes via hacked Internet-enabled devices? Could hackers mount large-scale terrorist attacks? Our research suggests these scenarios are within reason.
Your car, out of your control
Modern cars are one of the most connected products consumers interact with today. Many of a vehicle’s fundamental building blocks – including the engine and brake control modules – are now electronically controlled. Newer cars also support long-range wireless connections via cellular network and Wi-Fi. But hi-tech definitely doesn’t mean highly secure.
Our group of security researchers at the University of Washington was able to remotely compromise and control a highly computerized vehicle. They invaded the privacy of vehicle occupants by listening in on their conversations. Even more worrisome, they remotely disabled brake and lighting systems and brought the car to a complete stop on a simulated major highway. By exploiting vulnerabilities in critical modules, including the brake systems and engine control, along with radio and telematics components, our group completely overrode the driver’s control of the vehicle. The safety implications are obvious.
This attack raises important questions about how much manufacturers and consumers are willing to sacrifice security and privacy for increased functionality and convenience. Car companies are starting to take these threats seriously, appointing cybersecurity executives. But for the most part, automakers appear to be playing catchup, dealing with security as an afterthought of the design process.
Home insecurity
An increasing number of devices around the home are automated and connected to the Internet. Many rely on a proprietary wireless communications protocol called Z-Wave.
Two U.K. researchers exploited security loopholes in Z-Wave’s cryptographic libraries — the software toolkit that, among other functions, authenticates any device being connected to the home network and provides communication security over the Internet. The researchers were able to compromise home automation controllers and remotely controlled appliances including door locks and alarm systems. Z-Wave’s security relied solely on keeping the algorithm a secret from the public, but the researchers were able to reverse engineer the protocol to find weak spots.
Our group was able to compromise Z-Wave controllers via another vulnerability: their web interfaces. Via the web, we could control all home appliances connected to the Z-Wave controller, showing that a hacker could, for instance, turn off the heat in wintertime or watch inhabitants via webcam feeds. We also demonstrated an inherent danger in connecting compact fluorescent lamps (CFLs) to a Z-Wave dimmer. These bulbs were not designed with remote manipulation over the Internet in mind. We found an attacker could send unique signals to CFLs that would burn them out, emitting sparks that could potentially result in house fires.
Our group also pondered the possibility of a large-scale terrorist attack. The threat model assumes that home automation becomes so ubiquitous that it’s a standard feature installed in homes by developers. An attacker could exploit a vulnerability in the automation controllers to turn on power-hungry devices — like HVAC systems — in an entire neighborhood at the same time. With the A/C roaring in every single house, shared power transformers would be overloaded and whole neighborhoods could be knocked off the power grid.
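The arithmetic behind that scenario is simple. A back-of-the-envelope sketch, with all figures invented for illustration (real transformer ratings and air-conditioner loads vary widely):

```python
# Back-of-the-envelope sketch of the neighborhood HVAC attack.
# All figures are illustrative assumptions, not data from the article.
homes = 50                  # homes sharing one distribution transformer
ac_load_kw = 3.5            # central A/C draw per home, in kW
transformer_rating_kw = 75  # sized for "diversified" demand, i.e. the
                            # assumption that not all homes peak at once

simultaneous_load = homes * ac_load_kw        # every A/C on at once
overload_ratio = simultaneous_load / transformer_rating_kw
print(simultaneous_load, round(overload_ratio, 2))  # 175.0 2.33
```

The point is that simultaneity, not total energy, is the threat: a coordinated attack defeats the diversity assumption that utilities size their equipment around.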
Harnessing hackers’ knowledge
One of the best practices of designing elegant security solutions is to enlist the help of the security community to find and report weak spots otherwise undetected by the manufacturer. If the internal cryptographic libraries these devices use to obfuscate and recover data, amongst other tasks, are open-source, they can be vetted by the security community. Once issues are found, updates can be pushed to resolve them. Crypto libraries implemented from scratch may be riddled with bugs that the security community would likely find and fix – hopefully before the bad guys find and exploit them. Unfortunately, this sound principle has not been strictly adhered to in the world of the Internet of Things.
Third party vendors designed the web interfaces and home appliances with Z-Wave support that our group exploited. We found that, even if a manufacturer has done a very good job and released a secure product, retailers who repackage it with added functionality — like third party software — could introduce vulnerabilities. The end-user can also compromise security by failing to operate the product properly. That’s why robust multi-layered security solutions are vital – so a breach can be limited to just a single component, rather than a successful hack into one component compromising the whole system.
Level of risk
There is one Internet of Things security loophole that law enforcement has taken notice of: thieves’ use of scanner boxes that mimic the signals sent out by remote key fobs to break into cars. The other attacks I’ve described are feasible, but haven’t made any headlines yet. Risks today remain low for a variety of reasons. Home automation system attacks at this point appear to be very targeted in nature. Perpetrating them on a neighborhood-wide scale could be a very expensive task for the hacker, thereby decreasing the likelihood of it occurring.
There needs to be a concerted effort to improve security of future devices. Researchers, manufacturers and end users need to be aware that privacy, health and safety can be compromised by increased connectivity. Benefits in convenience must be balanced with security and privacy costs as the Internet of Things continues to infiltrate our personal spaces.
Temitope Oluwafemi is a Ph.D. student in electrical engineering at the University of Washington. This story is published courtesy of The Conversation (under Creative Commons-Attribution/No derivatives).
www.homelandsecuritynewswire.com 


Disaster Preparedness & Recovery : Sea-Level Dilemmas Quietly Swelling on First Coast, Fla.

A corps of residents — some in local governments, some activists or policy nerds — is charting steps communities can take now to avoid being caught unprepared when the tide rises.

Jacksonville, Fla.
The Concerned Scientists researchers say instances of coastal flooding in Jacksonville could rise from an average of seven per year now to 25 in 2030 — and 101 in 2045. Shutterstock
(TNS) — About 75 square miles of Northeast Florida real estate could be inundated by rising seas within 25 years. Or not.

Water to cover that ground might not arrive for another 50 years, maybe longer.

But almost certainly, it will get here.

That realization prompts a corps of First Coast residents — some in local governments, some activists or policy nerds — to chart steps communities can take now to avoid being caught unprepared when the tide rises.

Their answers have run the gamut, from lobbying for coastal property-insurance reforms to moving Green Cove Springs’ police station to higher ground and learning how to help Fernandina Beach’s historic properties manage flooding — flooding that a prominent science group says could happen dozens of times a year within 30 years in that town, and even more often in Jacksonville.

People who backed those projects don’t know how much water to expect. But they’re trying to get ready, just the same.

“It’s kind of like insurance. If you do this stuff, you’re insuring against it,” said David Reed, a JEA employee who chaired a committee of volunteers that researched lessons about sea levels for the Regional Community Institute of Northeast Florida, a nonprofit started by a regional planning council.

Their findings were adopted almost verbatim last year by the Northeast Florida Regional Council, a seven-county panel of elected officials who agreed they should prepare for seas rising somewhere between six inches and six feet.

When Hurricane Sandy slammed into New York City in 2012, flooding subways and causing an estimated $19 billion in losses, the impact from a 14-foot storm surge was magnified by high tides and by a 20-inch increase in seas since the late 18th century, scientists concluded last year. University of Florida geologists said this month that sea-level rise was helping erode dunes that protect two launch pads at Kennedy Space Center, although NASA has built replacement dunes.

An extra foot of sea level in Northeast Florida would cover about 75 square miles of private property as well as inundate a lot of parkland, starting with chunks of the Timucuan Ecological and Historic Preserve. A six-foot rise would cover 123,000 acres of privately owned land — 192 square miles — worth $6.4 billion, according to estimates the Regional Council delivered to a follow-up committee of business people and government types with the wonky name P2R2 (Public/Private Regional Resiliency).
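Those acreage and square-mile figures are consistent, as a quick conversion check shows (640 acres to the square mile):

```python
# Sanity check on the figure cited above: 123,000 acres of private
# land, converted at 640 acres per square mile.
ACRES_PER_SQ_MILE = 640
sq_miles = 123_000 / ACRES_PER_SQ_MILE
print(round(sq_miles))  # 192, matching the article's 192 square miles
```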

The committee, which meets again Friday, was asked to think about steps to “incentivize population and private development to locate outside of vulnerable areas.”

Talks like that often deliberately sidestep volatile questions about how much man-made pollution is driving climate changes.

Ocean levels are changing, say backers of planning efforts like the Regional Council’s, and how to handle the rising seas is problem enough for today.

“We are experiencing sea-level rise today, and we have been,” said Sarah Owen Gledhill, a St. Augustine-based planning advocate for the Florida Wildlife Federation. “We’re not debating whether sea-level rise was caused by human action or not, but we know it is happening and the scientists say it will get worse.”

Seas have risen about eight inches globally since 1880, and are expected to rise another one to four feet by 2100, the federal government’s National Climate Assessment reported this year. Two factors — the expansion of water as it warms and the melting of polar ice as temperatures rise — are commonly named as the main reasons for rising seas. Tide gauge readings taken at Mayport between 1928 and 2006 rose in some months and dropped in others, but overall suggested a rise of about nine and a half inches per century.
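Those trend figures invite a rough straight-line projection. A minimal sketch that extrapolates only the Mayport gauge trend cited above, ignoring the acceleration behind the National Climate Assessment's higher numbers:

```python
# Straight-line extrapolation of the Mayport tide-gauge trend cited
# above: roughly 9.5 inches of sea-level rise per century.
RATE_INCHES_PER_YEAR = 9.5 / 100

def projected_rise(years):
    """Inches of rise after `years` if the historical trend simply continues."""
    return RATE_INCHES_PER_YEAR * years

print(projected_rise(30))  # ~2.85 inches, 30 years out
print(projected_rise(85))  # ~8.1 inches by 2100 on the trend alone
```

The gap between that trend line and the federal one-to-four-foot range for 2100 is the projected acceleration from warming water and melting ice.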

Ordinary people haven’t been expected to say much about plans yet, because they haven’t been told much.

“Public education has not really begun in Northeast Florida,” a report produced through Reed’s committee said last year. It described a sort of survey being taken then and said that “the committee consciously designed them for public officials. Planners, city engineers, public works staff and utility staff made up the bulk of participants at assessments.”

But talk about sea level is percolating into more corners of the First Coast, a shift that St. Johns County resident Patrick Hamilton noticed when a staffer from the Army Corps of Engineers office in Jacksonville came to his Rotary club to talk.

“He painted a stark picture,” said Hamilton, a Realtor from Crescent Beach, who said members had split reactions. “When we went outside, some of them said ‘dang,’ and some said ‘I don’t believe that.’”

Hamilton, a longtime environmental advocate, was already thinking about the subject. This year, he wrapped up a role in a three-year review of how higher seas will impact areas around the sprawling Guana-Tolomato-Matanzas National Estuarine Research Reserve in St. Johns and Flagler counties. The answer, in a nutshell, was that water would build up in areas where a barrier stops its advance, then would eventually become high enough to get over that barrier and would start covering another area in a process called “terracing.” Drainage systems and sewer lines built below that terrace would back up and roadbeds would be undermined.

While there’s still time, Hamilton would like to see land preserved (a proposal is in the works now) so a Matanzas estuary that currently teems with fish can move inland through undeveloped areas as the ocean advances. To show how coastlines can change, he points to Summer Haven in southern St. Johns, where the road called Old A1A was washed out and closed decades ago, and more recent storms filled the Summer Haven River with enough sand that what’s left is barely a creek.

Others are focused on houses that a rising ocean would soak.

A lot are near the water already, and not building more in vulnerable areas would be a good step, said Gledhill, the Wildlife Federation advocate.

Congress decades ago started blocking federally backed flood insurance for new homes in so-called “coastal high hazard” areas, but Gledhill said Florida continued to insure new construction in vulnerable areas. That changed in July with the passage of a law, backed by a coalition of environmental groups, tax-watchers and business groups including the Florida Chamber of Commerce and the Associated Industries of Florida, that barred state-backed Citizens Property Insurance Corp. from insuring houses in environmentally sensitive coastal areas.

Backers say the change could keep investors from building oceanfront homes that can be washed away before mortgages are paid off.

“If you’re going to develop there, that’s fine. But do it on your own dime,” Gledhill said.

St. Johns officials declared a local state of emergency Monday, saying in a release that “recent severe wind, lunar tides, and high waves have caused erosion that poses an immediate threat of substantial property damage to habitable structures.” The declaration triggers a state law letting the county issue temporary armoring permits for homes in imminent danger.

There’s a lot less sea level risk in Clay County, but the chance of flooding beside a rising St. Johns River still helped convince Green Cove Springs officials to put a new police station on Florida 16, well west of the old station beside the town’s Spring Park along the river.

The new station, which opened in April, also houses an emergency operations center, and getting that out of the town’s flood plain just made too much sense, said city manager Danielle Judd.

Fernandina Beach has gone farther than most First Coast towns in thinking about sea-level rise, writing into its comprehensive plan for 2030 that it “recognizes sea-level rise as a potential coastal hazard, and shall work with Nassau County and state and regional entities … to develop strategies for responding.”

Those steps could include analyzing sea-level rise’s effects on wetlands, estuaries and beaches; identifying areas put at risk by higher water; and evaluating effects on the water table, public water systems and sewer systems.

A Jacksonville planner tracks Regional Council action on sea-level policies, but the city hasn’t adjusted any of its own plans yet, said Kristen Sell, a city spokeswoman. The city is working with state emergency management offices to see whether sea-level rise should affect its emergency plans, she said.

JEA had scheduled a review this year of how sea-level rise will affect its water systems, but pushed that back to the utility’s 2015-16 budget year, said spokeswoman Gerri Boyce.

One group is forecasting a lot of work for agencies that deal with flooding in Jacksonville and Fernandina, saying both communities are likely to be affected by changes along the East Coast. The Union of Concerned Scientists said in October that instances of flooding could triple in 15 years at most of the 52 cities its researchers examined between Maine and the Gulf Coast.

The forecast assumed the same sea-level increase for both cities — 4.7 inches by 2030 and 10.5 inches by 2045. That was based on projections about Fernandina by the website Climatecentral.org.

If that forecast is right, the Concerned Scientists researchers said instances of coastal flooding in Jacksonville could rise from an average of seven per year now to 25 in 2030 — and 101 in 2045.

The forecast said Fernandina would move from two flooding days per year to eight by 2030 and 37 in 2045.

Fernandina’s community development director, Adrienne Burke, said she’d like to arrange for someone from the Concerned Scientists to visit and talk more about the research.

But before the report came out, she was already trying to research how rising tides can be managed in historic areas like Fernandina, where buildings’ foundations have been in place a century or more and can’t get out of the way now. She’s taking advice from places that are already feeling effects, like the 18th-century section of Alexandria, Va., where the Potomac River periodically washes into historic buildings.

The subject came up recently at a project to restore a brick train depot built on Centre Street in 1899. One member of the restoration team was interested in a way to make the building more flood-resistant, while another focused on preserving its original design. The best answer they could settle on was to leave the doors open during floods so the water would pass through, and leave, as fast as possible.

Concerned Scientists raised concerns in the spring about the potential for sea-level damage in a range of historic areas, using St. Augustine’s Castillo de San Marcos as its poster child of threatened buildings.

Burke said there’s a lot still to work through about how residents should handle changing water levels, but it’s important to start the conversation.

“We already do see some flood events, and the community is aware of it,” Burke said. “We’re just beginning to talk about it.”

©2014 The Florida Times-Union (Jacksonville, Fla.). Distributed by Tribune Content Agency, LLC.
www.emergencymgmt.com 

Disaster Preparedness & Recovery : Controller for the Emergency Power Grid Is Coming

Energy groups are designing the controller, which will keep power flowing during a severe weather outage.

Superstorm Sandy knocked out power to more than 2.5 million people in New Jersey. Sharon Karr/FEMA
(TNS) — Clarkson University announced Wednesday that it and several energy groups are designing a controller for the emergency grid that someday will keep the power flowing during a severe weather outage.

The Enhanced Microgrid Control System (eMCS) will serve as the brains of the planned grid, increasing its efficiency and flexibility.

“New York state’s north country is a region where we have firsthand knowledge of the tremendous impact that weather can have on our utilities’ infrastructure,” Clarkson University President Anthony G. Collins said. “We are excited to be partnering in research that will have an impact not only on Clarkson’s neighbors, but also on communities like Potsdam around our state and nation, where severe weather can be disruptive to lives and commerce.”

The process will begin with 18 months of engineering and design done by General Electric Global Research. The National Renewable Energy Laboratory will then follow with six months of testing a microgrid that simulates Potsdam’s infrastructure.

A GE Global Research representative said that engineering work at its research center in Niskayuna will begin in January.

The eMCS is key to keeping the local power grid running for days if it is disconnected from the main state grid. The grid it controls will connect approximately 12 entities, including emergency service providers, utilities, power generation sources and staging areas, along with providers of housing, fuel and food, according to a college release.

The college did not say specifically what those entities would be.

The control system will coordinate controllers operating at different times to deliver a steady flow of electricity and keep the grid stable and secure.

“It’s a vital component and critical to the system’s resiliency and overall performance,” said Sumit Bose, the project’s principal investigator and microgrid technology leader at GE Global Research. “Together, GE’s control system — and the underground microgrid envisioned for the Potsdam community — could serve as a model for towns and cities across the country that are susceptible to weather disasters and blackouts.”

Other energy groups involved in the research include GE Energy Consulting, National Grid and the Department of Energy.

©2014 Watertown Daily Times (Watertown, N.Y.). Distributed by Tribune Content Agency, LLC.