Archive for month: November, 2018

How Supercomputers Can Help Fix Our Wildfire Problem – WIRED.com


FIRE IS CHAOS. Fire doesn’t care what it destroys or who it kills—it spreads without mercy, leaving total destruction in its wake, as California’s Camp and Woolsey fires proved so dramatically this month.

But fire is, to a large degree, predictable. It follows certain rules, prefers certain fuels, and tracks certain wind patterns. That means it moves with a complexity that scientists can pick apart little by little, thanks to lasers, fancy sensors, and some of the most powerful computers on the planet. We can’t end wildfires altogether, but by better understanding their dynamics, ideally we can stop a disaster like the destruction of Paradise from happening again.

You could argue that a wildfire is the most complicated natural disaster, because it’s both a product of atmospheric conditions—themselves extremely complex—and a manipulator of those conditions. For instance, California’s recent fires were driven by hot, dry winds coming from the east. These winds further dried out vegetation already parched by a lack of rainfall, fueling conflagrations that burn more intensely and move faster.

But wildfires also create their own weather patterns. Blazes produce hot air, which rises. “You can imagine that if something moves from the surface up, there must be some kind of horizontal movement of air filling the gap” near ground level, says Adam Kochanski, an atmospheric scientist at the University of Utah. Thus the fire sucks in surface winds.

Researchers are using supercomputers and lookout stations like this to model the dynamics of wildfires in real time. LOS ALAMOS NATIONAL LABORATORY

Wildfires don’t yet have the equivalent of a grand unified model to explain their behavior. The contributing factors are just so different, and work on such different scales—air dynamics for one, the aridity of local vegetation for another.

“That’s what’s really difficult from a modeling standpoint,” says Kochanski. You can’t hope to model a 50-square-mile wildfire with millimeter-scale resolution. So researchers like Kochanski simplify things. “We don’t really go into looking at how every single flame burns every single tree and how it progresses. No, we assume fuel is relatively uniform.”
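
To make that trade-off concrete, here is a toy cellular-automaton sketch of fire spread on a coarse grid. It is not FIRETEC; the uniform-fuel assumption and the spread probabilities are purely illustrative of the kind of simplification Kochanski describes.

```python
import random

# Toy cellular-automaton fire spread on a coarse grid (illustration only;
# real models such as FIRETEC solve coupled fluid-dynamics equations).
# Assumptions: uniform fuel, a fixed spread probability nudged by wind.
UNBURNED, BURNING, BURNED = 0, 1, 2

def step(grid, base_prob=0.3, wind=(0, 1), wind_boost=0.25):
    """Advance the fire one time step. `wind` is the (row, col) direction the
    wind blows toward; downwind neighbors ignite more easily."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != BURNING:
                continue
            new[r][c] = BURNED
            for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == UNBURNED:
                    prob = base_prob + (wind_boost if (dr, dc) == wind else 0.0)
                    if random.random() < prob:
                        new[nr][nc] = BURNING
    return new

grid = [[UNBURNED] * 20 for _ in range(20)]
grid[10][10] = BURNING          # single ignition point
for _ in range(15):             # advance 15 coarse time steps
    grid = step(grid)
print(sum(cell == BURNED for row in grid for cell in row), "cells burned")
```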

Still, advances in computing are allowing researchers to crunch ever more data. At Los Alamos National Laboratory, atmospheric scientist Alexandra Jonko is using a supercomputer and a system called FIRETEC to model fires in extreme detail. It models, among other things, air density and temperature, as well as the properties of the grass or leaves in a particular area.

Jonko runs a bunch of simulations with different wind speeds, typically on the scale of 40 acres. “It’ll probably take me about four hours to simulate between 10 and 20 minutes of a fire spreading,” she says.
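
A quick back-of-envelope check of what those figures imply (values taken from the quote above):

```python
# Roughly 4 hours of supercomputer time per 10-20 minutes of simulated fire.
compute_minutes = 4 * 60
for simulated_minutes in (10, 20):
    ratio = compute_minutes / simulated_minutes
    print(f"{simulated_minutes} simulated minutes -> about {ratio:.0f}x slower than real time")
# At this level of detail the simulation runs roughly 12x to 24x slower than real time.
```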

FIRETEC produces valuable physics-based data on fire dynamics to inform how fire managers do prescribed burns. This is pivotal for controlling vegetation that turns into fuel for fires. Wildfire agencies know generally the ideal conditions—low winds, for instance—but this type of modeling could help give even more granular insight.

FIRETEC models a portion of the 2011 Las Conchas fire near Los Alamos – LOS ALAMOS NATIONAL LABORATORY

To figure out where to do these burns, researchers are experimenting with lidar, the same kind of laser-spewing technology that helps self-driving cars find their way. This comes in the form of airborne lidar, which lets researchers visualize trees in 3D, supplemented with ground-based lidar, which details the vegetation underneath the trees.

That information is essential. “If we don’t know what the fuels are, then it’s a pretty big guess whether or not you’ve got dangerous fuels at a site,” says the University of Nevada, Reno’s Jonathan Greenberg.

The visualizations that come from lidar blasts are as stunning as they are useful. With this kind of data in hand, managers can more strategically deploy prescribed burns. California in particular has a serious problem with fire resources—in just the last year, the state has seen seven of its 20 most destructive fires ever. Money, then, goes to constantly fighting the infernos, leaving fewer resources for proactive measures like prescribed burns.

Another way to go about modeling fires is with reinforcement learning. You might have heard of researchers using this to get robots to learn—instead of explicitly showing a robot how to do something like putting a square peg in a square hole, you make it figure it out on its own with random movements. Essentially, you give it a digital reward when it gets closer to the correct manipulation, and a demerit when it screws up.

A model showing wind. The faster the wind, the longer and redder the arrows. LOS ALAMOS NATIONAL LABORATORY

Turns out you can do the same thing with virtual fire. “It’s kind of like Pavlov’s dog,” says computer scientist Mark Crowley of the University of Waterloo. “You give it a biscuit and it will do that trick again.”

Crowley begins with satellite thermal images that show how a wildfire has burned over an area. Think of this as the simulated fire’s “goal,” like a robot’s goal is to get the peg in the hole. This approach is still in its early days, and Crowley is busy helping his artificial flames learn the art of being fire. If the simulated fire accurately mimics how the real fire ended up traveling, the algorithm gets a digital reward—if not, it gets a demerit. “Then over time you update this function so it learns how to travel properly,” Crowley adds. In a sense, he can create a digital fire infused with artificial intelligence.
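
A toy illustration of that reward-and-demerit idea (not Crowley’s system) is to score a simulated burn footprint against an observed one; the array sizes and scoring rule below are illustrative assumptions only.

```python
import numpy as np

def burn_reward(simulated, observed):
    """Reward the fraction of the real burn the simulation matched, and
    penalize the fraction of simulated burning the real fire never did."""
    sim = simulated.astype(bool)
    obs = observed.astype(bool)
    hit = (sim & obs).sum() / max(obs.sum(), 1)            # matched spread
    false_spread = (sim & ~obs).sum() / max(sim.sum(), 1)  # overreach
    return hit - false_spread

observed = np.zeros((50, 50), dtype=int)
observed[20:35, 20:40] = 1          # stand-in for an observed burn scar
simulated = np.zeros_like(observed)
simulated[22:33, 18:38] = 1         # one candidate policy's predicted spread
print(f"reward: {burn_reward(simulated, observed):+.2f}")
# A learner would adjust its spread policy to push this reward upward over time.
```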

Out in the field, researchers are using a supercomputer at UC San Diego to confront the immediate threat of wildfires, with a program called ALERTWildfire. On mountaintops across California, lookout stations are loaded with sensors like high-def cameras and wind and moisture detectors. If the camera catches a fire breaking out, the system can pipe that atmospheric data to the supercomputer, which does real-time modeling of the blaze for fire agencies.

A terrestrial laser scanner image of a forest after a fire. UNR/USFS RSL/USFS FBAT/UMD/UE

“They can see where the fire is going, what it’s going to look like in the near term and long term, and then continue to receive live updates,” says Skyler Ditchfield, co-founder and CEO of GeoLinks, a telecom that’s partnered with the project.

Why a supercomputer? “The magic word here is fast,” says Ilkay Altintas, chief data science officer at the San Diego Supercomputer Center. Wind-driven fires move quickly, and the bigger a fire gets, the more data it produces. “The computational complexity can depend on how big the fire is, how complicated the topography is, how the weather is behaving.”

As the detection network grows—85 cameras are deployed right now, but the researchers hope to expand to over 1,000 across California—so too does the torrent of data. Also, at the moment, human eyes have to watch the camera feeds to detect fires, though the idea is to get AI to do that in the future.

Tech won’t solve all our wildfire problems—we need to band together to reinforce our cities, for instance. But with ever more data and computing power, and ever better models, we can get better at confronting the wildfire menace. Fire is chaos, but it’s not impossible to understand.


ALERTWildfire Partners With GeoLinks to Deploy Fire Detection and Prevention Systems Across California

CAMARILLO, Calif.–(BUSINESS WIRE)–ALERTWildfire, a consortium of three universities — The University of Nevada, Reno (UNR), University of California, San Diego (UCSD), and the University of Oregon (UO) — announced today that it has officially partnered with California-based telecom GeoLinks to deploy Wildfire Detection, Prevention, and Situational Awareness Systems across the state of California. With ample endorsement from the United States Forest Service (USFS), the Bureau of Land Management (BLM), California’s Governor-elect Gavin Newsom, and a multitude of utilities and counties, the project represents the future and next step in advanced firefighting and suppression.

Demand for the rapid system expansion was inspired by new pan-tilt-zoom (PTZ) wildfire camera technology, developed by Graham Kent, Director of the Nevada Seismological Lab at the University of Nevada, Reno, which became instrumental in both the response to and containment of the 2017 Lilac Fire in San Diego County. GeoLinks plans to deploy 28 additional PTZs by year’s end.

“ALERTWildfire is excited to work with GeoLinks as their resilient communications network throughout California enables a rapid deployment of fire cameras in critical regions of the state,” said Kent. “No other service provider is able to scale to this urgent task, and we look forward to dozens of cameras to be installed month-after-month as we ready ourselves for December 2018 and fire season 2019.”

Collocated across GeoLinks’ vertical assets in greater LA-Metro, Orange County, Riverside County, and Ventura County, the data collected from the PTZ cameras will be backhauled over GeoLinks’ ClearFiber™ network to WIFIRE at the San Diego Supercomputer Center at UC San Diego. WIFIRE, an integrated system for wildfire analysis, analyzes data collected from these cameras to create real-time simulations, wildfire path predictions, and visualizations of wildfire behavior. Ultimately, the system will provide strategic advantages for early fire detection, situational awareness for first responders, fire mapping, predictive simulations, and evacuation planning.

UC San Diego has already identified the next wave of key sites for GeoLinks to connect post initial project completion. Skyler Ditchfield, co-founder and CEO of GeoLinks, notes that with the comprehensive coverage of fixed wireless broadband that will accompany the camera network, LTE-based data connectivity and the extension of all first responder handheld radio systems can be efficiently added to close all connectivity gaps.

“The fact remains that California is now faced with wildfires year-round,” stated Ditchfield. “Wildfire detection, prevention, and situational awareness systems provide a solution that could make an immediate, lasting, and radical impact on the spread of fires and associated costs, damages, and casualties. GeoLinks, ALERTWildfire, and a variety of other affiliates across the state, including CENIC, are actively pushing the state-wide expansion of these technologies. If we had assets installed prior to the Camp Fire’s ignition, for example, we could have saved countless lives. This is really the future and next step in advanced firefighting and suppression.”

For media inquiries or interview requests, please contact Lexie Smith at [email protected].

About ALERTWildfire

ALERTWildfire is a consortium of three universities — The University of Nevada, Reno (UNR), University of California, San Diego (UCSD), and the University of Oregon (UO) — providing access to state-of-the-art Pan-Tilt-Zoom (PTZ) fire cameras and associated tools to help firefighters and first responders: (1) discover/locate/confirm fire ignition, (2) quickly scale fire resources up or down appropriately, (3) monitor fire behavior through containment, (4) during firestorms, help evacuations through enhanced situational awareness, and (5) ensure contained fires are monitored appropriately through their demise.

About GeoLinks

Headquartered in Southern California, GeoLinks is a leading telecommunications company and competitive local exchange carrier (CLEC) public utility, nationally recognized for its innovative Internet and Digital Voice solutions. Ranked first in category on Inc. Magazine’s Inc. 5000 Fastest Growing Companies in America in both 2017 and 2018, GeoLinks delivers Enterprise-Grade Internet, Digital Voice, SD-WAN, Cloud On-ramping, Layer 2 Transport, and both Public and Private Turnkey Network Construction expertly tailored for businesses and Anchor Institutions nationwide.

GeoLinks’ accelerated success is largely due to its flagship product, ClearFiber™, which offers dedicated business-class Internet with unlimited bandwidth, true network redundancy, and guaranteed speeds reaching up to 10 Gbps. Named “Most Disruptive Technology” in the 2018 Central Coast Innovation Awards, GeoLinks’ ClearFiber™ network is backed by a carrier-grade Service Level Agreement boasting 99.999% uptime and 24/7 in-house customer support. With an average installation period of 4 to 7 days, GeoLinks is proud to offer the most resilient and scalable fixed wireless network on the market.

Expanding Flexible Use of the 3.7 GHz to 4.2 GHz Band

Before the

Federal Communications Commission

Washington, DC  20554


REPLY COMMENTS OF CALIFORNIA INTERNET, L.P. DBA GEOLINKS

California Internet, L.P. DBA GeoLinks (“GeoLinks” or the “Company”) submits these reply comments in response to comments filed on the Notice of Proposed Rulemaking (“NPRM”) released in the aforementioned docket.[1]

  I. INTRODUCTION

GeoLinks is the fastest growing Internet and phone provider in America and the fastest growing telecom in California.  In addition, GeoLinks was recently awarded Connect America Fund Phase II Auction funding to serve 3883 Census Blocks in California and Nevada.  The Company has a vested interest in ensuring that the FCC’s policies allow competitive broadband providers to access vital spectrum resources and believes that the 3.7-4.2 GHz band provides opportunity for such access, subject to certain rules and requirements.

  II. DISCUSSION

  A. GeoLinks Supports the BAC’s Proposed Solution to Allow Spectrum Access for Fixed Wireless Providers in the 3.7-4.2 GHz Band

Millions of Americans lack what is considered, by today’s standards, high-speed broadband access – especially in rural areas.  As GeoLinks has previously advocated, sparsely populated rural areas are not well suited for traditional, wired broadband service given the cost to build and deliver a cable/fiber-based network, often resulting in these areas being left on the wrong side of the digital divide.  However, fixed wireless broadband technology can provide high-speed broadband to consumers in these areas for a fraction of the cost of traditional, wired networks. In addition, fixed wireless providers can (and do in some areas) offer competitive choice to consumers in urban and suburban areas.

Like other fixed wireless providers, GeoLinks’ technology platform depends on access to spectrum resources sufficient to support enterprise-level broadband connections. While spectrum resources do exist that have allowed fixed wireless providers to successfully deploy internet services in some areas, these resources have been available primarily on an unlicensed basis.  Unlicensed bands are not a one-size-fits-all option, as they are often subject to congestion and interference that can degrade wireless signals.

In order for fixed wireless broadband providers to truly compete with traditional, wired service providers, additional spectrum resources are needed. GeoLinks believes the 3.7-4.2 GHz band offers an opportunity for the Commission to allocate spectrum resources in a way that will promote competition and help bridge the digital divide while protecting current users of the band.

The Broadband Access Coalition (“BAC”) has set forth a “win-win-win solution that: (1) protects incumbent FSS operators from harmful interference; (2) clears a portion of the band for exclusive flexible use licensing; and (3) enables fixed P2MP broadband providers to deploy badly needed high-throughput broadband to unserved and underserved customers.”[2]  GeoLinks believes that this proposed solution strikes the right balance with respect to spectrum sharing, frequency coordination, buildout requirements, and Point-to-Multipoint (“P2MP”) deployment.  As such, GeoLinks supports the opening comments submitted by the BAC in response to the NPRM.

  B. The Commission Should Reject Any Arguments that Fixed Wireless Providers Already Have Access to All the Spectrum Resources They Need

GeoLinks urges the Commission to reject any argument that the spectrum resources that fixed wireless providers have now are “good enough.”  This status-quo mentality is exemplified in comments that appear to suggest that fixed wireless providers have all the spectrum they need or will get it eventually, so there is no need to look to the 3.7-4.2 GHz band for more.  Specifically, the C-Band Alliance explains that “any legitimate requirement for more spectrum for P2MP networks can be met using bands that are either currently available or are being considered for such operations.”[3]

GeoLinks strongly disagrees that fixed wireless providers have enough spectrum already.  As explained above, fixed wireless providers currently have access primarily to unlicensed spectrum.  In situations where only unlicensed spectrum is available, most connections are limited to point-to-point (“P2P”) connections over short distances to avoid interference with other users.  While fixed wireless providers have had success with these P2P connections, considering them “good enough” fails to account for all of the benefits that the technology could provide.  First, even with extensive engineering and coordination, there is no guarantee that interference won’t occur at some point over unlicensed spectrum bands.  This is especially true in densely populated, urban areas where there are numerous users in the unlicensed band.  This interference can make it difficult and costly to engineer a dedicated link to a customer to ensure enterprise-grade broadband service – a service that a fixed wireless provider must offer to be competitive in urban markets.  Second, P2P connections require expensive transmission equipment for each link (versus a single piece of equipment serving multiple links in a P2MP deployment).  These costs can make it difficult for fixed wireless providers to competitively price broadband services, especially in residential markets where P2P equipment may be cost prohibitive for residential subscribers.

GeoLinks has advocated for the benefits of P2MP services in numerous filings before the Commission.  This technology creates opportunities to connect multiple users in a more cost-effective manner (even if miles apart), making it ideal for serving multiple customers in one area at a lower cost.  Despite the benefits of this technology, however, current spectrum policies hinder fixed wireless providers’ ability to take advantage of it.  For example, P2MP connections are more susceptible to congestion and interference caused by extensive use of the unlicensed bands, especially in urban, highly populated areas. This makes high-quality P2MP connections over unlicensed spectrum nearly impossible in some areas, clearly refuting the notion that fixed wireless providers have all the spectrum they need.

Moreover, while there are a number of active proceedings before the Commission that may provide fixed wireless providers the ability to access additional licensed, light-licensed, or shared spectrum resources, many of those proceedings are also considering whether specific spectrum bands are better suited to other uses (e.g., mobile wireless).  In addition, the outcomes of those proceedings are still very much pending, and the Commission should not foreclose the option of fixed wireless use in the 3.7-4.2 GHz band just because spectrum might become available in another band at some point.

The BAC’s suggested solution for the 3.7-4.2 GHz band addresses the current spectrum limitations experienced by fixed wireless providers by proposing practical options for P2MP use within the band that will not interfere with existing use by FSS Operators.  The Commission should reject any arguments that fixed wireless providers have enough spectrum now (or will eventually) and therefore the Commission should not consider expanded use of the 3.7-4.2 GHz band.  Instead, GeoLinks urges the Commission to look to implement the BAC’s proposal and adopt spectrum policy that promotes innovation and competition.

  C. The Commission Should Adopt Robust Build-Out Requirements for the Band

As GeoLinks has advocated before, the Company believes that spectrum rights should be subject to robust build-out and “use it or lose it” requirements.  In its opening comments, the BAC supports the NPRM’s 12-month build-out period and proposes other build-out requirements, including limitations on channel reservation periods, minimum build-out standards for P2MP licensees, and limitations on P2MP spectrum use until build-out is complete.[4]  GeoLinks supports these suggested build-out requirements and urges the Commission to adopt them.

  III. CONCLUSION

GeoLinks supports the BAC’s opening comments submitted on the NPRM and urges the Commission to adopt its win-win-win proposal for the 3.7-4.2 GHz band.

 

Respectfully submitted,

GEOLINKS, LLC

/s/ Skyler Ditchfield, Chief Executive Officer

/s/ Melissa Slawson, General Counsel/V.P. of Government Affairs and Education

 

November 27, 2018

[1]Expanding Flexible Use of the 3.7 to 4.2 GHz Band, Order and Notice of Proposed Rulemaking, GN Docket No. 18-122, FCC 18-91 (rel. July 13, 2018) (“NPRM”).
[2]Comments of the Broadband Access Coalition, GN Docket 18-122 (filed October 29, 2018) (“BAC Comments”) at 3.
[3]Comments of the C-Band Alliance, GN Docket 18-122 (filed October 29, 2018) at 45.
[4]See BAC Comments at 25.

Disaster Recovery Plan – The Only Way to Ensure Business Continuity


Having a disaster recovery plan in place is one of the most essential parts of running a successful business. Just like business liability insurance, disaster recovery planning for your network ensures ongoing business continuity. Whether your disaster recovery plan is for site mirroring, load balancing or just staying online, it is the responsible thing to do for all business owners, CIOs, and IT managers.

This month, California witnessed one of the deadliest and most destructive wildfires in state history – the Camp Fire. Centered on the town of Paradise, California, this tragic disaster resulted in massive loss of life, structures, property, infrastructure, and habitat. Southern California also experienced two horrific wildfires, the Woolsey Fire and Hill Fire. These fires tore through Los Angeles and Ventura Counties, taking down nearly everything in their paths. Cities near the burn areas, while not officially evacuated, experienced county-wide network outages. That said, businesses with a disaster recovery plan have proven resilient. So, what exactly is a disaster recovery plan?

What is A Disaster Recovery Plan?

Disaster recovery planning entails outlining how to recover your business operations during or after a disaster. No business is immune to disaster, so having a plan in place protects your business from large financial losses and, in extreme cases, bankruptcy. While it may appear to be a daunting task, business owners will be glad to have a plan ready when disaster strikes. So how do you go about planning for a disaster? First, it’s worth exploring business interruption insurance, which covers the revenue a business might lose in a disaster. Next, consider following this quick checklist provided by Q Finance: The Ultimate Resource:

  • Business Impact Analysis:

    1. This is where you identify what parts of the business will be most impacted by a disaster.
    2. Calculate how much it would cost you to lose them in a disaster for a day, a week, and two weeks.
    3. Next, identify the maximum threshold your business can tolerate before being threatened with closure.
    4. List the minimum activities required to deliver identified parts of the business.
    5. Make sure adequate resources are available to provide those activities.
  • Risk Assessment Analysis:

    1. Identify what the risks are to the organization, such as loss of staff, suppliers, IT systems, and telecommunications.
    2. List plans already in place to deal with each risk.
    3. List plans that need to be put in place to deal with each risk.
    4. Assign a “likelihood to occur” score or probability to each risk (a brief scoring sketch follows this checklist).
  • Decide on what action to take for each identified risk:

    1. Deal with a risk by planning to operate at a minimum level.
    2. Tolerate the risk if the cost of reducing operations outweighs the benefits.
    3. Transfer the risk to a third party or take out insurance.
    4. Shut down / terminate the activity.
  • Write then share the plan(s):

    1. Start by writing a general plan, then decide if you need more detailed plans within that one.
    2. Write a scope and purpose for each plan.
    3. Identify the resources and contacts that own each plan and are responsible for it.
    4. List their contact details.
    5. List tasks, processes, and procedures used to respond to an incident.
    6. For business continuity, list the identified critical activities, how to recover them, and the timeline involved.
  • Test, update, and maintain plans:

    1. Plans must be tested. That is the only way to ensure that the plan can work in the real world as well as it works on paper.
    2. Involve staff and have them go through the plan and recommend improvements.
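
The risk-scoring step above lends itself to a simple, sortable register. A minimal sketch follows, using illustrative likelihoods and daily impact figures that each business would replace with its own estimates.

```python
# Minimal sketch of the risk-scoring step from the checklist above.
# Likelihood and daily impact figures are illustrative placeholders.
risks = [
    {"risk": "Primary internet outage", "likelihood": 0.30, "impact_per_day": 8000},
    {"risk": "Loss of key supplier",    "likelihood": 0.10, "impact_per_day": 5000},
    {"risk": "Office inaccessible",     "likelihood": 0.05, "impact_per_day": 12000},
]

for r in risks:
    r["expected_daily_loss"] = r["likelihood"] * r["impact_per_day"]

# Highest expected loss first: these are the risks to mitigate or insure first.
for r in sorted(risks, key=lambda r: r["expected_daily_loss"], reverse=True):
    print(f'{r["risk"]:<28} expected daily loss ~${r["expected_daily_loss"]:,.0f}')
```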

Telecommunications and Disaster Recovery Planning

The modern world has become extremely interconnected, especially now that online transactions have largely overtaken in-person ones. With most business activities occurring over a telecommunications network, companies depend on the reliability of their Internet connections now more than ever for business continuity. Not having a backup Internet connection, or one that guarantees uptime and redundancy, can cause major financial losses.

For example, if a brick-and-mortar store or restaurant loses its Internet connection, its POS system will go down. If a business’s POS system is out of order, it will be unable to charge customers for products and services. A major disaster might mean your business is delayed or completely halted for days or weeks at a time – that is, if a proper plan is not in place.

What if you are an e-commerce company? If you lose your Internet connection, you will not have access to the online orders customers are placing. Delays in processing orders will delay shipping, which will result in upset customers and a domino effect that is sure to affect your ability to gain new customers.

At a minimum, organizations should have a disaster recovery plan for their telecommunications infrastructure. There are several ways telecommunications companies can guarantee uptime. GeoLinks’ ClearFiber™ network, for example, offers a Service Level Agreement (SLA) that guarantees 99.999% uptime. To achieve 100% uptime, businesses are able to bundle in technologies such as LTE failover.
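
For context, the downtime that different uptime guarantees permit is simple arithmetic; the short sketch below (illustrative only) works it out for a few availability levels, including the 99.999% figure in the SLA mentioned above.

```python
# What various uptime guarantees allow in downtime per year.
minutes_per_year = 365.25 * 24 * 60
for label, availability in [("99.9%", 0.999), ("99.99%", 0.9999), ("99.999%", 0.99999)]:
    downtime = minutes_per_year * (1 - availability)
    print(f"{label} uptime -> about {downtime:,.1f} minutes of downtime per year")
# 99.999% ("five nines") works out to roughly 5 minutes of downtime per year.
```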

Long Term Evolution (LTE) and Business Continuity

LTE became a reality in 2010, and it was a big deal for the telecommunications industry. It brought much-needed low latency, high speeds, reliability, and power efficiency to wireless networks. LTE networks are leaps and bounds better than their 2G/3G predecessors.

LTE is the reason why we can have a gig economy with Uber, Lyft, and delivery services like GrubHub and DoorDash. It is also a wireless equivalent to a physical line. A well-designed network utilizes various types of technologies that can be depended on during different situations. For example, GeoLinks’ dedicated fixed wireless network, ClearFiber™, is connected to a fiber-optic backbone and has the ability to fail over to an LTE connection. Switching over to LTE is not like switching over to traditional mobile networks: its low latency and fast speeds provide you with uninterrupted service, especially in times of disaster.
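
As a rough sketch of how an automatic failover check might be wired up (an assumption-laden illustration, not GeoLinks’ implementation; the switch_to hook and the wan0 interface name are hypothetical):

```python
import subprocess
import time

# Sketch of a primary-link health check with LTE failover. `switch_to()` is a
# hypothetical hook; real equipment exposes this through its own API or
# routing configuration (route metrics, VRRP, SD-WAN policy, etc.).
PRIMARY, LTE = "primary", "lte"

def primary_is_healthy(probe_host="8.8.8.8", interface="wan0", attempts=3):
    """Probe the primary link directly by binding pings to its interface
    (`wan0` is a placeholder; -I selects the source interface on Linux)."""
    for _ in range(attempts):
        result = subprocess.run(
            ["ping", "-I", interface, "-c", "1", "-W", "2", probe_host],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
        )
        if result.returncode == 0:
            return True
    return False

def switch_to(link):
    print(f"(hypothetical) switching active uplink to {link}")

active = PRIMARY
while True:
    healthy = primary_is_healthy()
    if not healthy and active == PRIMARY:
        active = LTE
        switch_to(LTE)       # fail over when the primary link drops
    elif healthy and active == LTE:
        active = PRIMARY
        switch_to(PRIMARY)   # fail back once the primary recovers
    time.sleep(30)           # re-check every 30 seconds
```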

GeoLinks is proud to report our network has remained connected during California’s catastrophic fires. In fact, we are honored to be servicing CAL FIRE and Red Cross Evacuation Centers across Ventura County. If there is one thing California businesses should take away from the new year-round fire season, it’s that you must have a disaster recovery plan in place. At the bare minimum, have a plan for your telecommunications infrastructure and how to connect to the Internet.

To learn more about GeoLinks’ Disaster Recovery Solutions, call and talk to one of the GeoLinks team members today! (888) 225-1571

Wireless Smart Farming to Keep Frost Away From Citrus

A UCSB SmartFarm sensor, approximately 5 feet off the ground and surrounded by citrus, will help UC researchers know when to turn on windfans to protect plants from frost.

By Susan Rambo.

Computer science researchers from the University of California, Santa Barbara, are using the internet of things to prove that smart farming can be a farm implement as basic as the tractor and plough.

The husband-and-wife team of Chandra Krintz and Rich Wolski, both UCSB computer science professors, think data analytics can help tackle some of the tough challenges of modern agriculture. They want to apply to agriculture the kind of predictive mathematical leaps that modern internet commerce uses to predict what people will buy. The pair created the UCSB SmartFarm program in response to what they see as the main issues of agriculture.

Krintz and Wolski cite U.S. Department of Agriculture and United Nations Food and Agriculture Organization studies that say some scary stuff: more and more food is needed to feed the growing global population, yet farm labor is in short supply or too expensive. Eighty percent of fresh water and 30% of global energy are used to produce food, half of which we waste through spoilage. Farming also has some particularly tough foes: pests and disease attack farms’ output, and farmland is subsiding (sinking) — especially in California — because of groundwater overdraft. On top of all that, agriculture accounts for 22% of greenhouse gas emissions.

The only way smart farming can make a dent in those issues is to attack specific problems. For their first test projects, Krintz and Wolski talked to the farmer — in this case, farm researchers — before designing a system. Although almost every ag tech pitch begins with a summary of those issues, the UCSB computer scientists’ approach is to come up with scientifically vetted data about the usefulness of cloud and data analytics in farming.

The design goal behind UCSB SmartFarm’s Farm Cloud System is to make a system a farmer could love: it should be easy to use and work reliably, cheaply, and privately — farmers don’t want their data accessible to outsiders. The system needs to provide useful data to help increase yield, automate farm operations, or save money (or all three), and the data must be available in real time. The whole thing has to work without IT staff.

The self-managing system needs to work like an appliance, like your refrigerator, write Krintz and Wolski in a presentation about the project.

Krintz and Wolski are testing the system on nut trees at Fresno State and on citrus at the University of California’s Lindcove Research and Extension Center (LREC) near Visalia, Calif. The UCSB SmartFarm program has support from Google, Huawei, IBM Research, Microsoft Research, the National Science Foundation, National Institutes of Health and the California Energy Commission.

RCR Wireless News visited the LREC — a literal test bed for citrus and smart farming — and got the full tour of the UCSB’s Farm Cloud System.

Lindcove’s research mandate

The public is probably not aware that agricultural research centers, such as LREC (Lindcove), do the hard science that protects our food. In Lindcove’s case, that hard science is mostly the study of citrus trees, and it means the grueling work of studying each tree.

Dr. Beth Grafton-Cardwell, a research entomologist, integrated pest management (IPM) specialist, and Lindcove’s director, remembers sorting fruit by hand.

“When I first started in 1990, if we harvested in January, we would stand in the field in our long underwear and they would pick fruit into a bin and we would have ring sizers that told us what size the fruit was. We would count the fruit and size the fruit and write it on a clip board on a piece of paper,” she said. “Now this machine can do this better.”

Standing near a huge packing line machine that dwarfed her, Grafton-Cardwell explained how the cameras and the extra sensors enable the machine to size and weigh the fruit, examine the outside of the fruit using three types of cameras and estimate the sugar levels inside. One piece of fruit goes through the machine at a time, for scientific purposes, which differs from how a normal packing house operates.

“If I am a researcher, each of my trees is a replication and a different situation, so I want to know everything there is to know about the fruit on that tree,” said Grafton-Cardwell. The cameras take about 30 photographs of each piece of fruit, rotating the fruit as they go. Every parameter from each piece of fruit is put into a spreadsheet: “We know the size, the shape, if it has scarring, the precise color,” said Grafton-Cardwell.
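
As a minimal sketch of what one row per fruit might look like when written to a file, with hypothetical field names standing in for the packing line’s actual schema:

```python
import csv

# One record per piece of fruit, as described above. Field names and values
# are hypothetical placeholders, not Lindcove's actual schema.
FIELDS = ["tree_id", "fruit_id", "size_mm", "weight_g", "color", "scarring", "est_sugar_brix"]

records = [
    {"tree_id": "A-101", "fruit_id": 1, "size_mm": 71.2, "weight_g": 158.0,
     "color": "deep orange", "scarring": "none", "est_sugar_brix": 11.4},
    {"tree_id": "A-101", "fruit_id": 2, "size_mm": 66.8, "weight_g": 141.5,
     "color": "orange", "scarring": "light wind scar", "est_sugar_brix": 10.9},
]

with open("packline_output.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(records)   # one row per fruit, ready for per-tree analysis
```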

The growers paid for Lindcove’s packing line. “We can simulate anything you want to do on a commercial pack line,” said Grafton-Cardwell. Most packing houses have these machines but don’t use them the way researchers do. They use them for sorting fruit, not for collecting the precise data the researchers need.

“You have to train the machine to the colors and the blemishes. It can get overwhelming,” said Kurt Schmidt, Lindcove’s principal superintendent of agriculture. “We can slow everything down and gather an infinite amount of data.”

“The data sets are ginormous,” Grafton-Cardwell pointed out. Data, and an interpretation of that data, are really the product that Lindcove produces.

Originally started in 1959 by the University of California, Riverside and San Joaquin Valley citrus growers, Lindcove helps growers try out treatments and crop varieties without experimenting on their own crops, which protects their orchards — and livelihood. “Researchers from around the state can come here and do experiments,” said Grafton-Cardwell. Lindcove focuses on creating new varietals and maintaining demonstration gardens of hundreds of citrus varieties — a demo garden repeated in several other locations, such as the desert, for comparison. The center is working on 30 research projects right now.

“Citrus grows quite easily statewide… there are 300,000 acres [planted] statewide. It’s all fresh market, [California growers] don’t do juice. If the growers produce for juice, they lose money,” said Grafton-Cardwell. Florida and Brazil are the juice producers.

“Their climate produces a better juice fruit,” said Schmidt.

Lindcove is one of nine research centers in the University of California’s Agriculture and Natural Resources (ANR) department. With soil and climate typical of commercial citrus growing in California’s Central Valley, Lindcove’s 175 idyllic acres may be tucked remotely against the Sierra foothills on the road to Sequoia National Park, but it’s on the forefront of fighting some pretty scary citrus pests.

The Huanglongbing (HLB) bacterium has the citrus industry in California in an increasing panic. This bacterium, spread by the Asian citrus psyllid, a small bug imported from Asia, has already made its way up through Mexico and is now in Southern California and spreading northward.

Huanglongbing, also known as citrus greening disease, is killing trees at alarming rates and there is no cure yet. “It has devastated Florida. Huanglongbing has knocked their acreage down by 50 percent,” said Grafton-Cardwell. “We are trying to get some proactive research going to prepare for the arrival of the disease in the commercial citrus. Right now it is just in residential backyards, but it is going to get to the commercial citrus in the near future,” said Grafton-Cardwell.

In California, it is particularly hard to control because of the prevalence of backyard citrus trees.

“Right now it is just in Southern California. We are up to about 650 trees in Southern California that tested positive,” said Grafton-Cardwell. All of those infected trees were in residential yards. Therein lies the problem: An estimated 60% of homeowners have a citrus tree in their yard. “That’s like 15 million citrus trees. How do you manage a disease when you’ve got 30 million commercial trees and 15 million residential trees? It is very difficult,” she said. “Homeowners don’t understand plant disease, they don’t understand how to manage the pest, they don’t understand the risk.”

Unrelated to HLB, but nonetheless an insurance policy for all citrus growers, is Lindcove’s Citrus Clonal Protection Program (CCPP) out of UCR. Lindcove preserves and archives original budwood of citrus varieties as part of CCPP. Large screenhouses — greenhouses with screens instead of glass — hold clean budwood, which nurseries, growers and even citrus enthusiasts can use to propagate citrus plants. The citrus buds are grafted to rootstock and grown into trees in the screenhouses, where they are protected from insects.

The screens on these structures are “rated for thrips” — so fine that thrips or psyllids can’t get through them. Recently, when one of the screens had a breach, the CCPP program restarted all the trees in the screenhouse to make sure they were free of insects and disease. This is serious business.

First, the network

Lindcove has a new network capability now. “We are really excited,” said Dr. Grafton-Cardwell. “It has taken us ten years to get to the point where we have a network that can support all this, because we are out in the boonies.”

Lindcove now uses the fiber network from CENIC — the non-profit network operator for the California universities, colleges, schools and libraries — and fixed wireless company GeoLinks for last-mile wireless.

“We were getting our internet from a local provider here in Visalia with limited bandwidth for a lot of money,” said Schmidt. “So now we’ve got this big connection that has the potential to have a large bandwidth. We’re in pretty good shape.”

“ANR pushed really hard in the last couple years to develop the funding to do this for all the research and extension centers, all nine of them, because we were all created back in the 1950s, and most of us in the boonies, and none of us had decent network capability. For scientists in this day and age to do research, it is totally revolutionary,” said Grafton-Cardwell. “When I first came in 1990, we weren’t able to do any of this stuff. Computing was really primitive and now it is going to improve what we do.”

Smart farm at Lindcove

“I didn’t even know what the internet of things was before Rich Wolski explained it,” said Grafton-Cardwell, but now she can’t wait to get it.

The goal of the UCSB’s smart farm test at Lindcove is to improve the decision making for frost protection for citrus growers, which should help reduce costs and carbon footprint.

Schmidt pointed out the culprit: the big wind machines on citrus farms. These wind machines are needed because a typical inversion layer of warmer air above traps cold air near the ground, where it damages fruit. The wind machines circulate the air when frost is imminent. It costs a lot to run the wind machines, which run on propane. That’s not even counting the cost of having to drive around the fields in a truck, taking temperature readings at all hours, to decide when to turn on the wind machines.

Krintz and Wolski’s team of students has installed low-cost, sturdy weather stations that can withstand the elements and accurately sense temperature and humidity at 5 feet and 30 feet above the ground. The stations are placed so they can monitor conditions within 3 feet of the boundaries of the areas the wind machines cover. The poles also carry surveillance cameras with infrared capability to allow additional temperature measurement beyond regular thermometers. A network station in the field moves the data to the office on-site. Drones could be used “on the fly” to monitor at different levels.

Measuring and estimating the evaporation and transpiration under the tree canopy and sending that data to the office means that someone like Kurt Schmidt won’t have to manually take the temperature every hour, at all hours, to determine when to turn on the fans. Tapping into Schmidt’s knowledge of when the fans need to be turned on will also help inform the system; Krintz and Wolski can write software to automate the fans’ operation. Having more detailed information in real time means saving fuel if a windfan on one end of a microclimate doesn’t need to be turned on, even though others may need to run.
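
A toy version of that decision logic might look like the sketch below; the temperature thresholds and station readings are illustrative placeholders, not values from the Lindcove deployment.

```python
# Toy decision logic for the frost-protection scenario described above.
# Thresholds are illustrative; the real system would be tuned with growers'
# experience and per-microclimate sensor data.
FROST_THRESHOLD_F = 34.0      # turn fans on before fruit-damaging cold sets in
INVERSION_MARGIN_F = 3.0      # warm air aloft must exist for mixing to help

def should_run_windfan(temp_5ft_f, temp_30ft_f):
    """Run the fan only when it is cold near the ground AND there is warmer
    air aloft (an inversion) that the fan can pull down to the trees."""
    near_frost = temp_5ft_f <= FROST_THRESHOLD_F
    inversion_present = (temp_30ft_f - temp_5ft_f) >= INVERSION_MARGIN_F
    return near_frost and inversion_present

# Example readings from two microclimates: only the first fan needs to run.
for station, (low, high) in {"block A": (33.1, 38.4), "block B": (36.5, 37.0)}.items():
    print(station, "-> run fan" if should_run_windfan(low, high) else "-> leave off")
```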

This frost experiment is only the beginning.

“We have a laboratory here that has equipment in it that again, we could be connecting,” said Grafton-Cardwell. “One of the things I proposed to Chandra [Krintz] and Rich [Wolski], is we have all these data in separate units. The pack line generates data, we are collecting data from the field. That is going into files. The data aren’t connected in any shape or form.”

Grafton-Cardwell’s ultimate goal is to have a researcher go into a portal and view all the data associated with their research.

This article originally appeared in the RCR Wireless News, July 17, 2018, and is re-posted with permission in the UC IT Blog.

Susan Rambo covers 5G for RCR Wireless News.

Fighting Fire with Data: Wildfire Detection, Prevention, & Situational Awareness Systems

Original Source

The unprecedented scale and scope of recent catastrophic wildfires show that larger swaths of California are at risk than previously understood. Smart investments in strategic technologies may serve to limit the loss of life and property damage. One promising — and proven — line of defense is connecting remote cameras and weather sensors across the state to a vast mesh of wireless and fiber-optic cable to relay data. The collected data is combined and analyzed to produce information that supports wildfire prevention, detection, and management.

This system — ALERTWildfire (University of Nevada, Reno; University of California, San Diego; and the University of Oregon) — is actively collaborating and partnering with local firefighters, GeoLinks, and CENIC. During the 2016-2017 fire seasons, the system provided critical information for over 350 fires, and in 2018 it has assisted in more than 150 fires so far.

Statewide expansion of this proven system would offer strategic advantages for early fire detection, situational awareness for first responders, fire mapping, predictive simulations, and evacuation planning. Rapid investment in this shovel-ready system would soon save lives, property, habitat, and infrastructure across California, and the state would see an almost immediate return on its investment. Additional partners that would benefit from this effort and so might be approached for financial support are the insurance industry, technology accelerators, and local community organizations.

How It Works

ALERTWildfire uses a network of cameras to continuously capture images of high-risk California landscape, while weather sensors on many of the same towers collect data on wind, humidity, fuel moisture, and other factors. The data is passed along via GeoLinks’ fixed wireless microwave technology and then handed off to CENIC’s high-capacity, optical-fiber network that runs throughout California. WIFIRE then analyzes the data to create real-time simulations, wildfire path predictions, and visualizations of wildfire behavior, and provides these visuals to firefighters to inform evacuation and containment planning. Data visualization is also supported by the California Institute for Telecommunications and Information Technology’s (Calit2) Qualcomm Institute.

For example, early fire detection by ALERTWildfire provides immediate input to burn models that incorporate weather, fuels, and topography. Such a collaboration exists between ALERTWildfire and WIFIRE (San Diego Supercomputer Center) to provide first responders with burn models almost in real time. WIFIRE was launched in October 2013 with a grant from the National Science Foundation and has been advised by representatives from CAL FIRE, the US Forest Service, the US Bureau of Land Management, the National Institute of Standards and Technology, and the Los Angeles Fire Department. WIFIRE’s “Firemap” software rapidly and accurately predicts and visualizes wildfire rates of spread. In late 2017, over 800,000 public users accessed information with the Firemap tool over 8 million times. Since grant funding ended this year, WIFIRE has operated under an annual subscription model for the fire departments of Los Angeles, Orange, and Ventura Counties.

What Is Needed Now

While these efforts have prevented significant loss of life and property during recent California wildfires, this fire monitoring network is geographically limited in its current deployment. Now is the time to expand the use of this proven system across the state while systematically integrating it with local networks. Some possible next steps:

  • Include language in future legislation allowing for data, communications, and broadband strategies to support wildfire data applications;
  • Extend towers, cameras, and fixed wireless capacity throughout the state to provide first responders with powerful, contemporary tools;
  • Where wireless towers exist on state property, work with ALERTWildfire to support the installation of cameras and other equipment to expand coverage;
  • Explore opportunities to coordinate this system with FirstNet to augment the reach of this national first-responder network.

In light of the devastating effects of wildfires on California, scaling this work to create a vast data relay mesh across the state, in partnership with first responders, utility companies, and the State, would significantly protect Californians and lead the way for other states that are also fighting fires of unprecedented scale.

This article is available in PDF format for convenient dissemination.

Procedures to Identify and Resolve Location Discrepancies in Eligible Census Blocks Within Winning Bid Areas

Before the

Federal Communications Commission

Washington, DC  20554

 

Procedures to Identify and Resolve Location Discrepancies in Eligible Census Blocks Within Winning Bid Areas
WC Docket No. 10-90

 

REPLY COMMENTS OF CALIFORNIA INTERNET, L.P. DBA GEOLINKS

 

California Internet, L.P. DBA GeoLinks (“GeoLinks” or the “Company”) submits these reply comments in response to comments filed on the Public Notice released by the Wireline Competition Bureau (“Bureau”) on September 10, 2018 regarding procedures to identify and resolve location discrepancies in eligible census blocks within Connect America Fund Phase II (“CAF II”) winning bid areas.[1]

 

  I. INTRODUCTION

Several commenters in the aforementioned proceeding share GeoLinks’ view that the Bureau should create a straightforward process for resolving location discrepancies that may exist in Phase II auction support areas.  GeoLinks believes that such a process is necessary to ensure that CAF II recipients and relevant stakeholders are able to gather and report accurate location-specific data.  As such, GeoLinks makes the following recommendations.

 

  II. DISCUSSION
  A. Prospective Developments

In the Public Notice, the Commission asks whether “actual locations should include prospective developments that have a reasonable certainty of coming into existence within the support term.”[2]  GeoLinks agrees with commenters that ask the Commission not to require CAF II recipients to include prospective developments in the definition of “actual location.”

In both California and Nevada, the states for which GeoLinks has been awarded CAF II funding, there have been many instances where housing developments have been planned, or even started, but then downsized, abandoned, or put on indefinite hold.  While many of these developments do eventually get built, as WISPA notes, there is no guarantee that information regarding new developments will stay constant past the one-year period of determining “locations” or that those plans won’t be modified to increase or decrease the number of housing units, small businesses, etc.[3]  As USTelecom explains, “Providers cannot be omnipresent in local real estate planning over the next year and auditing whether a provider could have, or should have, known about a prospective development would be extremely subjective.”[4] Moreover, other commenters advocate for the Bureau to “permit support recipients to rely on any reasonably current data source” and to avoid “imposing evidentiary burdens beyond those that are strictly necessary.”[5]

For these reasons, GeoLinks urges the Bureau not to require that prospective developments be included in the definition of “actual location.”  However, if a CAF II recipient chooses to include prospective developments in its definition of actual locations, GeoLinks agrees with WISPA that it should be allowed to do so if it can provide information to show that specific prospective locations are more likely than not to be constructed and inhabited within the six-year buildout period.[6]

 

  B. Reliability and Validity of Data

In its opening comments, GeoLinks urged the Bureau not to limit broadband providers’ ability to determine what methodology may work best for them to gather information regarding the number of locations within an area so long as the provider can explain that methodology.  This sentiment was echoed by several commenters that offered numerous proposals beyond those methodologies that the Public Notice called “generally accepted.”[7]

USTelecom suggests that providers should be able to rely upon desktop geolocation or automated address geocoding.[8]  WISPA discusses the possibility of aerial imagery (which GeoLinks also suggested in its opening comments), as well as the possibility of combining findings from desktop geolocation using web-based maps and imagery with other qualitative criteria such as roof size or other visual evidence.[9] Verizon suggests refining initial analysis with web-based maps or targeted GPS data in the field.[10]  Hughes urges the Bureau to allow recipients to utilize third-party geocoding providers.[11]  Moreover, Commnet explains that any process to collect required location-specific showings “must account for areas such as Tribal Lands where standard street addresses are not available and commercial geocoding data are scant and unreliable.”[12]

GeoLinks believes that the proposal of many different options makes clear that there are many ways for CAF II recipients to verify location data.  So long as a CAF recipient’s selected methodology (or methodologies) can be explained, it should not be precluded from using any reasonable method.  Therefore, GeoLinks continues to urge the Bureau not to limit available methodologies to verify location data.

 

  C. Relevant Stakeholders’ Evidence

With respect to the definition of “relevant stakeholders,” GeoLinks strongly agrees with WISPA that this definition should be limited to individuals, state and local authorities, and Tribal governments in the relevant supported area.[13]  Additionally, GeoLinks strongly agrees that “the evidence submitted by stakeholders should be the same as is required to be submitted by participants.”[14]  Both GeoLinks and WISPA urge the Bureau to require relevant stakeholders to submit a narrative description of the methodology they used to challenge the location information provided by a CAF II recipient and to certify under penalty of perjury that 1) the location data they are providing is accurate, 2) the stakeholder is located (or represents individuals that are located) within the relevant geographic area, and 3) the stakeholder is not associated in any way with a competitor.[15]  As WISPA explains, “it should not be sufficient for a stakeholder to solely allege deficiencies in the participant’s methodology.”[16]

 

  III. CONCLUSION

Based on the foregoing, GeoLinks urges the Bureau to adopt the recommendations discussed herein, as agreed to by several parties to this proceeding, regarding procedures to identify and resolve location discrepancies in eligible census blocks within CAF II winning bid areas.

 

Respectfully submitted,

 

GEOLINKS, LLC

 

/s/ Skyler Ditchfield, Chief Executive Officer

/s/ Melissa Slawson, General Counsel/V.P. of Government Affairs and Education

 

November 13, 2018

[1]Public Notice, “Wireline Competition Bureau Seeks Comment on Procedures to Identify and Resolve Location Discrepancies in Eligible Census Blocks Within Winning Bid Areas,” WC Docket No. 10-90, DA 18-929 (rel. Sept. 10, 2018) (“Public Notice”).
[2]Public Notice at 5.
[3]See Comments of the Wireless Internet Service Providers Association, WC Docket 10-90 (filed Oct. 29, 2018) (“WISPA Comments”) at 3.
[4]Comments of USTelecom, WC Docket 10-90 (filed Oct. 29, 2018) (“USTelecom Comments”) at 3.
[5]Comments of Verizon, WC Docket 10-90 (filed Oct. 29, 2018) (“Verizon Comments”) at 5 and Comments of Hughes Network Systems, WC Docket 10-90 (filed Oct. 29, 2018) (“Hughes Comments”) at 2, respectively.
[6]WISPA Comments at 3.
[7]See Public Notice at 11.
[8]USTelecom Comments at 4.
[9]WISPA Comments at 4-5.
[10]Verizon Comments at 3.
[11]Hughes Comments at 3.
[12]Comments of Commnet Wireless, Inc., WC Docket 10-90 (filed Oct. 29, 2018) at 2.
[13]See WISPA Comments at 6.  See also USTelecom Comments at 5.
[14]WISPA Comments at 7.
[15]See WISPA Comments at 6.
[16]WISPA Comments at 7 (emphasis added).

GeoLinks CEO Joins FCC Broadband Committee

GeoLinks Chief Executive Skyler Ditchfield has been appointed to a working group of the Federal Communications Commission’s Broadband Deployment Advisory Committee.

As a member of the Disaster Response and Recovery Working Group, Ditchfield will contribute to recommending measures to improve broadband infrastructure before disasters happen and to restore it afterward.

Ditchfield is the only California representative and the only fixed wireless broadband provider in the working group. His Camarillo company is a mid-sized internet service provider.

He was honored and excited to be part of the working group, Ditchfield said, adding that in the past few fire seasons his staff at GeoLinks has gained experience at restoring connectivity during natural disasters.

“I am confident our working group can not only improve the resiliency of broadband infrastructure before disasters occur nationally, but also ensure that connectivity is both maintained and restored as quickly as possible,” Ditchfield said in a statement.

FCC Chairman Ajit Pai Appoints GeoLinks’ CEO Skyler Ditchfield to the BDAC Disaster Response and Recovery Working Group

The panel is tasked with developing best practices to improve broadband outage response caused by local, state, and national disasters

CAMARILLO, Calif.–(BUSINESS WIRE)–On Thursday, November 1, 2018, Federal Communications Commission (FCC) Chairman Ajit Pai announced GeoLinks’ CEO Skyler Ditchfield’s appointment to the FCC’s Broadband Deployment Advisory Committee (BDAC) Disaster Response and Recovery Working Group.

As stated in the FCC’s formal news release, the Disaster Response and Recovery Working Group is tasked with recommending measures to improve the resiliency of broadband infrastructure before disasters occur, as well as actions that can be taken to more quickly restore broadband infrastructure following a disaster. The Chairman has also charged the working group with developing best practices for coordination among wireless providers, backhaul providers, and power companies during and after a disaster.

“Broadband communications have become essential to the delivery of life-saving information in a disaster,” Chairman Pai said. “It’s critical to public safety that our broadband networks are as resilient as possible to prevent outages in a disaster and also can be restored as quickly as possible when an outage occurs.”

Led by Chair Red Grasso, FirstNet State Point of Contact for the North Carolina Department of Information Technology, and Vice-Chair Jonathan Adelstein, President & Chief Executive Officer of the Wireless Infrastructure Association, Ditchfield is the only California-based representative, and the only fixed wireless broadband provider in the group.

“I am both honored and excited to be part of this working group,” said GeoLinks’ co-founder and CEO Skyler Ditchfield. “Throughout the past few fire seasons in California, my team and I have gained extensive experience in recovering from and restoring connectivity during natural disasters. During the Thomas Fire, for example, we were able to re-establish services in less than 24 hours, whereas many terrestrial providers remained down for months. From solar and wind powered towers, to backup generators, we also have significant expertise in utilizing alternative power methods, technologies that become critical during catastrophic weather events. Locally, I am actively working on a large-scale, state-wide project that will utilize a multitude of technologies, including mobile relay stations, to create true network resilience, ultimately preserving connectivity during disasters. Every region and every disaster in our country has its own subset of challenges. I am confident our Working Group can not only improve the resiliency of broadband infrastructure before disasters occur nationally, but also ensure that connectivity is both maintained and restored as quickly as possible.”

While the first formal meeting of the group has not been publicly announced, a complete list of members is available at https://docs.fcc.gov/public/attachments/DA-18-1121A1.docx.

For media inquiries or interview requests, please contact Lexie Smith at [email protected].

###

About GeoLinks

Headquartered in Southern California, GeoLinks is a leading telecommunications company and competitive local exchange carrier (CLEC) public utility, nationally recognized for its innovative Internet and Digital Voice solutions. Ranked first in category on Inc. Magazine’s Inc. 5000 Fastest Growing Companies in America in both 2017 and 2018, GeoLinks delivers Enterprise-Grade Internet, Digital Voice, SD-WAN, Cloud On-ramping, Layer 2 Transport, and both Public and Private Turnkey Network Construction expertly tailored for businesses and Anchor Institutions nationwide.

GeoLinks’ accelerated success is largely due to its flagship product, ClearFiber™, which offers dedicated business-class Internet with unlimited bandwidth, true network redundancy, and guaranteed speeds reaching up to 10 Gbps. Named “Most Disruptive Technology” in the 2018 Central Coast Innovation Awards, GeoLinks’ ClearFiber™ network is backed by a carrier-grade Service Level Agreement boasting 99.999% uptime and 24/7 in-house customer support. With an average installation period of 4 to 7 days, GeoLinks is proud to offer the most resilient and scalable fixed wireless network on the market.

Disaster Response and Recovery Working Group

FCC ANNOUNCES MEMBERSHIP OF THE BROADBAND DEPLOYMENT ADVISORY COMMITTEE’S DISASTER RESPONSE AND RECOVERY WORKING GROUP

Read Official Notice here: https://docs.fcc.gov/public/attachments/DA-18-1121A1.docx.

Released: November 1, 2018

GN Docket No. 17-83

This Public Notice serves as notice that Federal Communications Commission (Commission) Chairman Ajit Pai has appointed members to serve on the Disaster Response and Recovery Working Group of the Broadband Deployment Advisory Committee (BDAC).  The members of this working group are listed in the Appendix.

The BDAC is organized under, and operates in accordance with, the Federal Advisory Committee Act (FACA).[1]  The BDAC’s mission is to provide advice and recommendations to the Commission on how to accelerate the deployment of high-speed Internet access.[2]

The BDAC’s Disaster Response and Recovery Working Group is charged with making recommendations on measures that can be taken to improve resiliency of broadband infrastructure before a disaster occurs, strategies that can be used during the response to a disaster to minimize the downtime of broadband networks, and actions that can be taken to more quickly restore broadband infrastructure during disaster recovery.  It is also charged with developing best practices for coordination among wireless providers, backhaul providers, and power companies during and after a disaster.

More information about the BDAC is available at https://www.fcc.gov/broadband-deployment-advisory-committee.  You may also contact Paul D’Ari, Designated Federal Officer (DFO) of the BDAC, at [email protected] or 202-418-1550; or the Deputy DFOs, Deborah Salons at [email protected] or 202-418-0637, or Jiaming Shang at [email protected] or 202-418-1303.

 

MEMBERS OF THE DISASTER RESPONSE AND RECOVERY WORKING GROUP

 

Chair:

Red Grasso, FirstNet State Point of Contact

North Carolina Department of Information Technology

 

Vice-Chair:

Jonathan Adelstein, President & Chief Executive Officer*

Wireless Infrastructure Association

 

Members:

 

Skyler Ditchfield, Chief Executive Officer

GeoLinks

 

Andrew Afflerbach, Chief Executive Officer and Director of Engineering, CTC Technology and Energy

National Association of Telecommunications Officers and Advisors

 

Allen Bell, Distribution Support Manager, Georgia Power Company*

Southern Company

 

Megan Bixler, Technical Program Manager for Communications Center and 911 Services

Association of Public Safety Communications Officials

 

Patrick Donovan, Senior Director, Regulatory Affairs

CTIA

 

Tony Fischer, Director, Information Technology

City of Germantown, Tennessee

 

Monica Gambino, Vice President, Legal

Crown Castle

 

Larry Hanson, Executive Director*

Georgia Municipal Association

 

David Hartshorn, Chief Executive Officer

Geeks Without Frontiers

 

Greg Hauser, Communications Branch Manager/Statewide Interoperability Coordinator,

North Carolina Emergency Management Division

National Emergency Management Association

 

Kurt Jacobs, Corporate Director, Emerging Technology & Solutions

JMA Wireless

 

Richard Kildow, Director of Business Continuity & Emergency Management

Verizon

 

Frank Korinek, Director of Government Affairs

Motorola

 

Wyatt Leehy, Information Technology Manager

Great Plains Communications

 

David Marshack, Telecommunications Regulatory Lead

Loon

 

Jim Matheson, Chief Executive Officer*

National Rural Electric Cooperative Association

 

Kelly McGriff, Vice President & Deputy General Counsel*

Uniti Group

 

Wendy Moser, Commissioner, Colorado Public Utilities Commission

National Association of Regulatory Utility Commissioners

 

Alexandra Fernandez Navarro, Commissioner

Puerto Rico Public Service Regulatory Board

 

John O’Connor, Director, National Coordinating Center for Communications

Department of Homeland Security

 

Eddie Reyes, Prince William County Emergency Communications Center

National Public Safety Telecommunications Council

 

Rikin Thaker, Vice President, Telecommunications and Spectrum Policy*

Multicultural Media, Telecom and Internet Council

 

Pete Tomczak, Manager, Spectrum Coordination and Clearance

FirstNet

 

Rocky Vaz, Director of Emergency Management

City of Dallas, Texas

 

Joseph Viens, Senior Director of Government Affairs

Charter

 

Debra Wulff, Public Safety Director

Confederated Tribes of the Colville Reservation