GeoLinks Opening Comments on Establishing the Digital Opportunity Data Collection

Before the

Federal Communications Commission

Washington, DC  20554

In the Matter of Establishing the Digital Opportunity Data Collection

COMMENTS OF CALIFORNIA INTERNET, L.P. DBA GEOLINKS

California Internet, L.P. DBA GeoLinks (“GeoLinks” or the “Company”) submits these Comments in response to the Report and Order and Second Further Notice of Proposed Rulemaking (“2nd FNPRM”) issued August 6, 2019 in the above-captioned proceeding.

INTRODUCTION AND SUMMARY

GeoLinks is one of the fastest-growing Internet and phone providers in America and the #1 fastest-growing fixed wireless service provider in California.  While the Company originally focused on business and enterprise customers, in 2016 GeoLinks turned its focus to expanding its customer base to include unserved and underserved areas throughout California and beyond.

GeoLinks is an advocate for improved broadband availability mapping and commends the Commission on its efforts to modernize its broadband data collection processes.  While the 2nd FNPRM proposes some improvements to how the Commission currently collects broadband data, other proposals fail to take into account the fundamental differences that exist between technology types and the resources available to small and mid-sized service providers.  GeoLinks presents these comments to provide guidance to the Commission regarding data collection methods that are best suited for collecting fixed wireless broadband availability data.

DISCUSSION

  • The Commission Should Adopt the Safe Harbor Provisions Proposed by WISPA

Fixed wireless technology is unique.  Because it utilizes direct, line-of-sight connections (from specific point to specific point), some of its characteristics are similar to wireline technology.  At the same time, because it is wireless and does not carry the connection requirements of wireline technology (i.e., physical wires), it also shares many characteristics with mobile wireless.  However, it is neither wireline nor mobile wireless and, therefore, requires broadband reporting processes specifically tailored to account for these differences.

As proposed, the data collection processes set forth in the 2nd FNPRM do not work for fixed wireless providers.  A variety of factors, including the location of transmission towers, the specific equipment used, the available spectrum bands, and line-of-sight from a tower, must all be considered when engineering a fixed wireless network.  Logically, these factors also come into play when measuring broadband availability and therefore make the creation of a reporting polygon extremely challenging (at least in the form proposed by the 2nd FNPRM).  Accordingly, the Commission must adopt a solution that allows it to obtain the granular data it seeks while accounting for the technological differences of fixed wireless services.
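To make this concrete, the sketch below is a purely illustrative model, not WISPA’s methodology or any provider’s actual engineering tool: it approximates a single fixed wireless sector as a pie-slice polygon defined by a tower location, antenna azimuth, beamwidth, and a maximum service radius (all values hypothetical).  Even this toy model, which ignores the terrain and line-of-sight effects a real propagation study must capture, shows how the reporting polygon changes whenever the tower site, antenna, or spectrum band changes.

    import math

    def sector_polygon(tower_lat, tower_lon, azimuth_deg, beamwidth_deg, radius_km, steps=16):
        # Approximate one fixed wireless sector as a pie-slice polygon.
        # A real propagation map must also model terrain, obstructions,
        # and band-specific behavior; this is geometry only.
        points = [(tower_lat, tower_lon)]  # the sector's apex is the tower
        start = azimuth_deg - beamwidth_deg / 2.0
        for i in range(steps + 1):
            bearing = math.radians(start + beamwidth_deg * i / steps)
            dlat = (radius_km / 111.0) * math.cos(bearing)  # ~111 km per degree of latitude
            dlon = (radius_km / (111.0 * math.cos(math.radians(tower_lat)))) * math.sin(bearing)
            points.append((tower_lat + dlat, tower_lon + dlon))
        return points

    # Hypothetical sector: 90-degree beamwidth aimed due south, 10 km radius.
    coverage = sector_polygon(34.2164, -119.0376, azimuth_deg=180, beamwidth_deg=90, radius_km=10)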

GeoLinks supports the reporting approach previously advocated by the Wireless Internet Service Providers Association (“WISPA”).  As WISPA explained, “in order to fulfill the overall objectives for accurate data for all areas of the country, especially rural areas, modernization must take into account the inherent differences in deployment and technology between wired broadband services and fixed wireless broadband services, as well as recognize and reduce the significant financial burdens on small providers.”  As such, WISPA’s proposal recommended “a two-pronged process to be used by fixed wireless providers to create propagation maps that better illustrate deployment coverage for various fixed wireless spectrum bands.”  GeoLinks believes that this proposed solution strikes the right balance between the Commission’s interest in securing granular broadband availability data and the realities of fixed wireless service and strongly urges the Commission to adopt WISPA’s safe harbor parameters.

One addition to WISPA’s proposal that GeoLinks would suggest is the option for fixed wireless service providers to report expanded coverage where service availability extends beyond the proposed safe harbor parameters.  While GeoLinks believes that the safe harbor parameters proposed by WISPA are generally good measures of the broadband service that will be realized, the Company also believes that in some instances additional coverage area may be possible.  To ensure the most accurate reporting possible, GeoLinks urges the Commission to adopt the safe harbors in WISPA’s proposal with the option for service providers to submit an expanded polygon if they so choose.  GeoLinks understands that any polygon areas falling outside the safe harbor areas would be subject to additional scrutiny by the Commission.

  • The Commission Should Require Broadband Service Providers to File Corrected Broadband Availability Data with Their Next Reporting Opportunity

In the 2nd FNPRM, the Commission proposes that USAC “ensure that providers refile updated and corrected data in a timely fashion” and seeks comment on the “appropriate time period (if any) for fixed providers to respond to a complaint.”  GeoLinks agrees that any inaccurate data provided by a broadband provider should be corrected.  In the case of fixed wireless providers, any data that a provider chooses to report outside of established safe harbors could be subject to correction, if inaccurate.  However, GeoLinks urges the Commission not to implement correction timeframes that impose additional burdens on service providers.

The Commission should require that any corrected data be submitted with a service provider’s next filing opportunity under the requirements of the Digital Opportunity Data Collection (“DODC”).  Broadband reporting efforts are time- and resource-intensive.  This is especially true for small and mid-sized providers that may not have in-house GIS specialists or data analysts dedicated to broadband mapping.  Requiring service providers to incur the cost of filing frequent corrections may lead them to underreport broadband availability in order to avoid filing corrections.  Instead, folding corrections into the required reporting at a set interval (as proposed in the 2nd FNPRM) will not require service providers to allocate more resources than they already do for these ongoing filings.  Moreover, this will ensure the most accurate data possible is provided at each filing deadline.  For these reasons, GeoLinks urges the Commission to allow service providers to correct any inaccurate data with their next filing.

The 2nd FNPRM also asks whether the Commission should require providers to resubmit all earlier datasets for the affected areas to conform to any corrections.  GeoLinks sees no value in resubmitting old data that may be outdated anyway.  First, broadband data for small and mid-sized carriers may change over time due to customer attrition, changes in equipment used, network updates, etc.  Therefore, past data sets may differ from newly reported data sets, and requiring correction could mean re-submitting incorrect data.  In addition, as stated above, broadband data reporting is already a time- and resource-intensive effort for small and mid-sized service providers.  Requiring submission of new data and old data when an error is found could double or triple the work required, with no actual benefit to the Commission’s mapping efforts.  Instead, GeoLinks urges the Commission simply to require that the most correct data be submitted at each reporting deadline and to use that data to populate its broadband availability tools.

  • The Commission Should Require Individuals to Provide Proof that a Service Provider Declined to Provide Service Within the Applicable 10-Business Day Period

In the 2nd FNPRM, the Commission proposes to require “that individuals disputing coverage certify that they have requested service from the provider and that the provider either refused, or failed, to provide service within the applicable 10-business day period.”  While GeoLinks believes that certification is a good start, false certifications would be difficult to identify before the Commission/USAC and the subject service provider expend time and resources to investigate.  Therefore, the Company urges the Commission to go one step further and require that disputes not only include a certification but also include proof that the service provider declined to provide service.  This could perhaps be in the form of an email from the service provider to the individual, a cancelled service order, or a transcript from a call to customer service.  This proof requirement will help ensure that the Commission/USAC and service providers investigate only legitimate disputes.  Moreover, it would also help eliminate the risk of malicious challenges via automated tools or bots.

  • The Commission Should Not Require Fixed Broadband Providers to Report Latency Levels

In the 2nd FNPRM, the Commission seeks comment on “whether fixed broadband providers should include latency levels along with the other parameters in reporting their coverage polygons.”  The simple answer is “no” for a number of reasons.

As an initial matter, latency testing is not something service providers commonly do because it is costly and does not provide valuable data to the provider.  While latency testing is required under the Connect America Fund (“CAF”), CAF recipients are only required to test a subset of customers, built the costs of such testing into their CAF auction bids, and are receiving high-cost support, in part, to undertake this testing.  To impose latency testing on every service provider for all reported data would be extremely burdensome.  Second, GeoLinks fails to see what value this data would provide the Commission to warrant such burdensome testing requirements.  Depending on the applicable protocol and the engineering of a network, a service provider can deliver high-speed broadband and a high-quality user experience to its customers even with what may be considered higher latency.  From this perspective, so long as the customer is obtaining the speeds they expected, a specific latency measurement is unnecessary.  Lastly, latency is not a measure of broadband “deployment,” which the Commission states is “critical to the Commission’s efforts to bridge the digital divide.”  Therefore, and for the foregoing reasons, GeoLinks urges the Commission not to impose the burden of latency testing on providers.

CONCLUSION

GeoLinks commends the Commission on its efforts to modernize its broadband data collection processes.  In order to ensure more granular data that takes into account the fundamental differences between technology types and the resources available to small and mid-sized service providers, GeoLinks urges the Commission to adopt the safe harbor proposal set forth by WISPA, require service providers to file corrected data only with their next submission opportunity (and only on a forward-looking basis), require proof of service denial in the dispute process, and refrain from requiring service providers to provide latency data that will not improve the Commission’s understanding of the current status of broadband deployment.

 

Respectfully submitted,

California Internet, L.P. DBA GeoLinks

/s/ Melissa Slawson, General Counsel/V.P. of Government Affairs and Education

 

September 23, 2019

 

5 Factors to Consider When Choosing the Best ISP for Your Business

How do you choose the best Internet Service Provider for your business?

With day-to-day business operations becoming increasingly reliant on the Internet, choosing an Internet Service Provider (ISP) is more important than ever. For those lucky enough to be in a market served by a multitude of carriers, such as Los Angeles or Orange County, there are five primary factors to consider when either onboarding with or switching to a new ISP.

1. Reputation

One of the most reliable ways to vet a new telecom provider is by researching their reputation. Whether that be by reading through online reviews posted on Yelp or Google, or talking directly to the neighboring businesses in your area, understanding an ISP’s reputation is one of the most foolproof ways to really know what you’re signing up for.

That being said, make sure you’re mindful of what type of customer the reviews are coming from – i.e., whether they’re from residential customers or from other businesses like yours.

2. Service-Level Agreements (SLAs)

A service-level agreement (SLA) is a contractual commitment between an Internet Service Provider and a client. An ISP’s SLA should typically outline guaranteed service metrics such as uptime, latency, jitter, packet loss, and response/repair time. For example, GeoLinks’ SLA is fairly straightforward and offers the following (see the sketch after this list for one way such targets might be checked):

SERVICE TARGETS

  • Response Priority (Critical): 4 hours or less
  • Network Quality of Service:
      • Network Availability: target of at least 99.999% uptime
      • Round-Trip Latency: under 40 ms
      • Jitter: under 10 ms
      • Packet Loss: target of less than 0.1%
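As a rough illustration of how a business might check measured performance against targets like those above, here is a minimal sketch; the metric names and thresholds are hypothetical placeholders patterned on the list, not GeoLinks’ contractual terms:

    # Hypothetical SLA targets patterned on the list above (illustrative only).
    SLA_TARGETS = {
        "availability_pct": 99.999,  # minimum uptime percentage
        "latency_ms": 40.0,          # maximum round-trip latency
        "jitter_ms": 10.0,           # maximum jitter
        "packet_loss_pct": 0.1,      # maximum packet loss percentage
    }

    def sla_violations(measured):
        # Return the names of any metrics that miss their target.
        missed = []
        if measured["availability_pct"] < SLA_TARGETS["availability_pct"]:
            missed.append("availability_pct")
        for metric in ("latency_ms", "jitter_ms", "packet_loss_pct"):
            if measured[metric] > SLA_TARGETS[metric]:
                missed.append(metric)
        return missed

    # Example month: availability, latency, and loss are fine; jitter misses.
    print(sla_violations({"availability_pct": 100.0, "latency_ms": 35.2,
                          "jitter_ms": 12.4, "packet_loss_pct": 0.05}))
    # -> ['jitter_ms']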

On the other hand, some providers don’t offer a guaranteed service level at all. For example, there are a variety of providers who simply state:

“X company does not warrant that the service will be uninterrupted or error free nor make any warranty as to the results obtained from the use of the service. X company does not guarantee connectivity at any time, for any length of time or at any particular speed.”

Therefore, when deciding to onboard or switch to a new Business Internet Service Provider, make sure you carefully research the provider’s SLA so you know what service quality to expect.

3. Customer Support

While in an ideal world businesses would never have to engage with their ISP past service installation, unfortunately, that is just not the case.

Whether a client has billing questions, service issues, technical support needs, upgrade inquiries, or product add-on requests, at some point or another a business will likely have to engage with an ISP’s customer support team. Therefore, research what type of support the company offers.

A larger carrier, for example, might make you sit through an automated phone menu, place you on a lengthy hold, and eventually transfer you to a contracted employee outside of the U.S. Alternatively, a medium-sized ISP, such as GeoLinks, offers 24/7 in-house customer support; customers are even able to ask for customer support reps by name.

Another element to consider is overall responsiveness. If your business does experience a technical issue, how long does it take the provider to respond and address it? Time is money, so whether it be hours wasted on hold or weeks waiting on a repair, how an ISP handles customer relations directly affects its business customers’ bottom line.

4. Agility and Flexibility

As a business grows and changes, its overall telecom needs will as well. For example, if a law firm hires 10 more associates, it will likely need to upgrade its overall bandwidth. Furthermore, if juggling multiple carriers and multiple bills becomes too large a strain on a company’s accounting department, the business may wish to streamline all of its telecom needs with a single carrier.

Some ISPs offer additional services such as VoIP and SD-WAN, while others do not. Therefore, when selecting an ISP, make sure to explore their entire product suite and offerings. Choosing an aggregator (an ISP that is capable of reselling multiple ISPs’ products and services), such as GeoLinks, ensures that no matter how a business grows or changes, a single provider will be able to upgrade and adapt to its evolving needs.

5. Bandwidth Availability

Do you know how much Internet bandwidth your company needs? If not, check out “Your Guide to Determining Bandwidth Requirements.”

While it may seem obvious, when choosing an ISP it’s necessary to ensure they can provide the speeds your company needs. Based on your location and the type of Internet access you are looking for (e.g., fiber vs. fixed wireless vs. DSL), bandwidth availability may fluctuate from carrier to carrier. Furthermore, if it does appear the ISP offers what you are looking for, make sure you understand whether it is a dedicated or shared circuit, as this too will impact the reliability and consistency of speeds.
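If you want a starting point before diving into the guide, a back-of-the-envelope estimate can be as simple as summing per-user demand and adding headroom. The per-activity figures below are rough illustrative assumptions, not carrier-published requirements:

    # Rough per-user demand in Mbps by concurrent activity (assumed values).
    PER_USER_MBPS = {
        "email_and_browsing": 1.0,
        "cloud_apps": 2.0,
        "hd_video_call": 4.0,
    }

    def estimate_bandwidth(concurrent_users, headroom=1.3):
        # Sum concurrent demand, then add ~30% headroom for growth and bursts.
        base = sum(PER_USER_MBPS[activity] * count
                   for activity, count in concurrent_users.items())
        return base * headroom

    # Example: a 25-person office where 10 people are on HD video calls at once.
    needed = estimate_bandwidth({"email_and_browsing": 25, "cloud_apps": 25, "hd_video_call": 10})
    print(round(needed))  # roughly 150 Mbps

A dedicated circuit should deliver a figure like that consistently; on a shared circuit, treat it as a peak-hour minimum.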

Feeling overwhelmed? If you’re struggling to decide which ISP is best for your business, consider contracting an IT consultant. Typically vendor-agnostic, IT consultants can survey all the carriers available in your area, weigh the above criteria, and present a business with its best option.

Curious if GeoLinks is the right ISP for your business? Call and talk to a Client Consultant today!

Round Up – Industry Experts Share Their 2019 Telecom Predictions

From the emergence of fixed wireless and hybrid networks, to the predictive realities of 5G, telecom experts share their 2019 industry forecasts.

Please note that the views and opinions expressed in this article do not represent my personal views or those of my employer, nor do they imply endorsement by either. They are unique and independent to the individual contributors listed as each statement’s source.

__

From the roll out of new Artificial Intelligence (AI) integrations, to the highly anticipated future of 5G, in 2018 we saw the telecommunications industry generate some pretty innovative trends and thought-provoking headlines. With the new year just around the corner, I thought I’d turn to a variety of diverse industry experts to learn about their 2019 telecom predictions. Here is what they had to say:

There will be a lot of providers deploying 5G, but monetization will prove a challenge

Nathan Rader, Director of NFV Strategy, Canonical

There will be a race to see who can market 5G the quickest and who will have it as standard first. We’re already seeing tests from multiple providers across the world in isolated areas, and the speed and size of rollouts will only increase as providers look to gain the upper hand.

However, this race could be a costly one. Consumer need for 5G isn’t as great as it was for previous generations. 4G can handle most consumer use cases (such as streaming, gaming, browsing, etc.) fairly comfortably and with reasonable speed.

5G’s main benefit is providing increased capacity, rather than speed or latency, making it more of a technical development. Being the first standard 5G network will be a marketing coup, but may not come with the consumer kudos and demand that being first once did.

Further widespread adoption of Fixed Wireless

Phillip Deneef, Chief Strategy Officer, GeoLinks

We’ve seen fixed wireless technology evolve and improve drastically over the last decade, concurrently beginning to debunk “wireless anxiety”. During the Federal Communications Commission’s (FCC) CAF II Auction in 2018, we saw federal acceptance and adoption through the distribution of significant funding to WISPs, such as GeoLinks. This culminates in my prediction that in 2019 we will see a drastic spike in both businesses and community anchors being connected via fixed wireless. While I do think fiber will remain top of mind for many key stakeholders, I foresee anchors, rural health care facilities as a specific example, better understanding that Ethernet over Fixed Wireless (EoFW) is the most cost-effective and time-efficient way to get these critical care facilities the speeds they need. Taking guidance from both the FCC and overall industry adoption, I predict that, on a state level, those governing RFP fund distributions will also be more open to fixed wireless solutions. This will directly result in the United States making substantial strides in closing the digital divide.

Competition in Hosted VoIP market will heat up

Marc Enzor, VoIP Consultant & President, Geeks 2 You

Hosted VoIP phone systems are the hottest thing right now in telecom. Even small and medium-sized businesses are starting to become aware of what hosted VoIP is and to gravitate toward it. In years past, we would spend most of our sales pitch educating customers on what it is, how it works, and why they should use it. In recent months, customers are already aware and ready to purchase immediately. The sales cycle has gone from multiple meetings to a single meeting. It has become one of the hottest products we sell.

Going into 2019, it’ll only become even more “standard knowledge”, which means the competition in the hosted VoIP market will heat up. I predict several of the biggest names will start to buy out the competition and a true industry leader will emerge. This will have to happen, as today’s top companies will outgrow their current growth models and will need to find ways to replace the lost growth as competition gets bigger.

Only edge computing / edge networking and AI will show true growth

Alan J Weissberger, ScD EE, IEEE Communications Society, techblog.comsoc.org

Only two areas in the telecom/networking space deserve the attention they are getting: 1] edge computing/edge networking and 2] Artificial Intelligence (AI).

Edge computing/edge networking is needed to offload the processing that takes place in cloud-resident data center computers and also to reduce latency for critical real-time control applications (especially for IoT).

AI and deep learning will be embedded into software-defined architectures in telco networks and the cloud to do analytics, predict failures, and move many manual human processes into automated operations. The long-term goal is to move from automated elements to closed-loop automation and finally to autonomous control of networks.  I believe AI will be critically important to advancing emerging telecom services and enabling new ones.  Examples include 5G, Industrial IoT, autonomous vehicles, Augmented Reality/Virtual Reality, etc.  It will also be very useful for more mundane things, like keeping up with WAN and Internet bandwidth demands due to increased video streaming by cord cutters and pay TV customers (like this author) who increasingly stream sporting events (e.g., MLB TV, NBA League Pass, NHL Center Ice, boxing, etc.).

All the other new technologies are hyped to the infinity power and headed for a train wreck.  That’s especially true of 5G, but also includes “Software Defined” networks (SDN and SD-WAN), Network Function Virtualization (NFV), and LPWANs for the Internet of Things (IoT).  All of those suffer from a lack of interoperability, due either to the lack of standards, too many specs/standards (LPWANs), or proprietary implementations (e.g., the SDN implementations from AT&T, Google, Amazon, Microsoft, etc., are not interoperable with each other; they each use different specs, with many being proprietary while others are based on open source software).  None of them will make much money for any company in the coming year.  Perhaps a few SD-WAN start-ups will be acquired and investors in those companies will profit, but that’s about it.

Enterprises cut the cord with LTE

Hansang Bae, CTO, Riverbed

For enterprises, 2019 isn’t a forecast of all doom and gloom. The year will also bring freedom from the persistent “last-mile” telecommunications problem. With the advancements in LTE, the technology will finally reach a point where the physical cables that connect end-users to their Internet Service Providers will no longer be a necessity — or a handcuff to a particular ISP.

The “last-mile” problem has long been the most critical and most costly component of an ISP’s network, as well as a speed bottleneck. But now, on the heels of widespread adoption, LTE will allow enterprises to forego the last mile of physical cable for a reliable and robust connection.

Purpose-built Security Software will emerge

Don Boxley, Co-Founder and CEO, DH2i

Making smart products, i.e., IoT devices, is the new product differentiator; today, even ovens have IP addresses. Companies that have been investing in IoT initiatives understand that the IoT gateway layer is the key that unlocks a high return on those IoT investments. IoT gateways manage device connectivity, protocol translation, updating, management, predictive and streaming data analytics, and data flow between devices and the cloud. Improving the security of that high data flow with a Zero Trust security model will drive enterprises to replace VPNs with micro-perimeters. Micro-perimeters remove an IoT device’s network presence, eliminating any potential attack surfaces created by using a VPN.

Likewise, many organizations are pursuing a hybrid strategy involving integrated on-premises systems and off-premises cloud/hosted resources. But traditional VPN software solutions are obsolete for the new IT reality of hybrid and multi-cloud. They weren’t designed for them. They’re complex to configure, and they give users a “slice of the network,” creating a lateral network attack surface. A new class of purpose-built security software will emerge to eliminate these issues and disrupt the cloud VPN market. This new security software will enable organizations to build lightweight dynamic micro-perimeters to secure application- and workload-centric connections between on-premises and cloud/hosted environments, with virtually no attack surface.

Hybrid Networks become more common

Louis Fox, CEO & President, CENIC

In terms of widespread internet connectivity, the low-hanging fruit has long been picked. To achieve a complete mesh across the state, and thereby to include all of our communities and lift all boats, private-sector technology companies will need to work more collaboratively with government and nonprofit community organizations to approach an underserved geographic region with a comprehensive strategy that stitches together fiber, fixed wireless, unlicensed spectrum, TV whitespace, and more. We can no longer deploy in a series of one-offs if we are ever to serve some of the hardest to reach places.

More Internet Networks deploying IPv6

John Curran, President and CEO, ARIN

The Internet has grown remarkably over the past few years and as a result we now have over four billion people online. The Internet will continue to grow at a remarkable pace to meet the requirements of broadband, mobile, and Internet-of-Things (IoT) growth, and this will only increase pressure on Internet Service Providers (ISPs) to deploy the next version of the Internet Protocol (IP version 6, or IPv6) — just as many broadband and mobile providers have already done today. The good news is that the IPv6 transition happens in the “lower layers” of the Internet, so this behind-the-scenes upgrade to the Internet will continue to happen without any noticeable change for Internet users.

Public and Private Clouds will be much more accommodating of each other

Jai Menon, Chief Scientist and IBM Fellow, Cloudistics

[In 2019] only about 5 viable general-purpose public cloud vendors will survive. This is because successful public cloud vendors will need to spend a lot of money, and few can afford to spend as much as the Top 2 — AWS and Microsoft Azure. [Furthermore] Public and private clouds will be much more accommodating of each other. More and more of the services provided by a public cloud vendor, such as their AI services, will become accessible to apps running elsewhere, including on private clouds. At the same time, there will be more and more examples of private cloud capabilities extended to the public cloud — such as VMware Cloud on AWS. Finally, federated orchestration and management of workloads across private and public clouds, all from a single, easy to use, portal will be commonplace.

Political turbulence and possible decrease in network investment

John Windhausen, Executive Director, Schools, Health & Libraries Broadband (SHLB) Coalition

2019 could be a turbulent year in the telecom/broadband space. If the FCC approves the proposed Sprint/T-Mobile merger, and if the court allows the AT&T-Time Warner merger, that could encourage even more consolidation in the marketplace. Of course, more consolidation among big players also opens up more opportunities for smaller, more nimble players to increase their market share. But there are increasing signals of an economic slow-down in 2019, which could mean belt-tightening and reduced investment by all players. The tariffs on Chinese-made equipment could mean increased prices for telecom gear, which could also lead to a pause in network investment. These trends may give a boost to the idea of a grand broadband infrastructure spending bill that both the President and Hill Democrats are trying to get in front of (assuming the government shutdown does not ruin the chances of bipartisan agreement forever). Such legislation would only have a 30% chance of enactment but could be exciting to watch, as there are so many industry players that could benefit from government funding, especially in rural markets. I expect net neutrality to continue to percolate because the court is likely to remand the case to give the FCC another chance to justify its decision. Congress could and should step in, but there is no sign of compromise, and the issue likely will remain gridlocked. For anchor institutions, work will continue to get the E-rate and Rural Health Care programs running smoothly, but I do not anticipate major structural changes.

Do you agree or disagree with any of the above predictions? If so, feel free to visit the original article here, and leave a comment.

California’s Research Network Connects Science and Community

Louis Fox, CENIC CEO

By Susan Rambo.

CENIC — the Corporation for Education Network Initiatives in California — wants to connect the state of California into one giant wireless mesh network. With 20 million users, non-profit network operator CENIC (pronounced “scenic”) may be in a good position to build that network. But they aren’t doing it on their own. Far from it.

CENIC is part of a large community of public and private entities working to improve connectivity throughout California, an effort that has links to national and international projects. It all started with — and is grounded in — researchers. CENIC is governed by its charter members, California’s research institutions.

Since 1997, CENIC has provided networks for those researchers. Now, with over 8,000 miles of optical fiber, the nonprofit operates the California Research and Education Network (CalREN), the high-capacity network fabric for California research institutions. The fabric consists of broadband connections, upon which last-mile wireless can be added if needed. Eventually that last mile may include 5G wireless technologies.

CalREN offers 100 gigabit Ethernet (GbE), mostly via dark fiber, to researchers in California public and private research institutions (Stanford, California Institute of Technology, University of Southern California, University of California). State universities, community colleges, and K–12 schools were added to the network in the early 2000s, followed by public libraries and cultural assets. CENIC aims for a minimum of 1 Gbps symmetrical on any connection it provides, regardless of whether it is fiber or wireless.

The high bandwidth is important to researchers who need to move data — lots of data.

“An awful lot of data is being collected by sensor nets and other kinds of data-intensive scientific tools. Historically [researchers] had to use sneakernet to get at the data,” CENIC’s President and CEO Louis Fox told RCR Wireless News. Now researchers have CalREN, which provides high-bandwidth connections.

“Where possible we’ve made fiber connections and in other cases we have worked with wireless providers to get fixed wireless and high-bandwidth fixed wireless to the sites,” said Fox. “We try and get as much bandwidth as possible.”  

CENIC typically asks for symmetrical bandwidth.

“Where possible a minimum of one gig symmetrical is our goal. It isn’t always possible in some of these sites because they’re rural and remote and we’ve worked in particular with GeoLinks — a very innovative private sector fixed wireless provider,” said Fox. 

The research platforms extend beyond California’s borders. The National Science Foundation recently funded Science DMZs — networks for Big Data transfers from supercomputers. The NSF is funding the Pacific Research Platform (PRP) through UC San Diego and UC Berkeley.  Fox agrees that CENIC’s PRP is a testbed for other Science DMZs throughout the country.

“We’re part of a conversation that involves other regions of the country that are beginning to roll out what was done here in California,” said Fox.  

CENIC also collaborates with the Energy Sciences Network (ESnet), run from LBNL (Lawrence Berkeley National Laboratory), which connects to 40 Department of Energy sites. On a larger scale, CalREN is part of Pacific Wave — an international collaboration to connect researchers around the Pacific Rim. CENIC’s CalREN networks also work with Internet2 (which runs the national backbone network) and Pacific Northwest Gigapop, nonprofits that both serve networks of researchers and educators. CENIC also supports California Telehealth Network and fire and safety initiatives and research throughout the state. 

CENIC also supports the efforts of the California Cities Data-Sharing Project and the Big Data, Big Cities Initiative to connect California cities.

Rural, farming communities 

Bringing more people access to the network, including rural communities, is a goal for CENIC, although not an official mandate. The nonprofit helps bring better internet access to rural and remote parts of California.

“There are these tremendous opportunities for being part of this new economy regardless of where you are. When we’re talking about the rising generation, the goal is to ensure that all Californians have access and opportunity,” said Fox, adding, “we work with our carriers, both wireless and terrestrial, to do last-mile connections to schools, to libraries and to community colleges.”

Proving the demand in rural areas starved of wireless Internet access, Fox and U.S. Department of Agriculture’s broadband analyst Robert Tse, who spoke with RCR Wireless News recently, report seeing people in rural areas outside public libraries in lawn chairs, on the library steps or in their cars after the libraries were closed, accessing the library’s wireless broadband connections.

“It’s such a critical resource for communities,” said Fox.

Connecting farmers and rural underserved populations may go hand in hand. CENIC is working with UC ANR (the University of California’s Agriculture and Natural Resources division) to improve the abysmal connections at the nine UC ANR extension centers where field research is done on crops. A recent boost in fiber capacity and the low-cost addition of a wireless network in a field at UC ANR’s Kearney research area near Fresno have Kearney researchers thinking they could use the connected field as a demonstration for a nearby rural town to get it connected at low cost.

“We’ve moved into this whole arena of wireless extensions of the backbone network for three main areas,” said Fox. Connecting the community through libraries and schools is one. Second is helping researchers work on emergency systems such as fire and earthquake warnings. Third is precision agriculture. “That’s where UC ANR comes in,” he said.

For farmers, all the sensors and data need to be collected and processed.

“Those sensors need direct access to a network so that both researchers and farmers can have immediate access to the data and then subsequently to the analytic tools which make sense of that data,” said Fox.

Right now CENIC mostly provides broadband over fiber.

“Historically, we have focused on terrestrial infrastructure. We run a pretty significant broadband backbone with multiple hundred gigs connecting roughly 12,000 institutions in California,” said Fox. With the help of GeoLinks, a private company and like-minded partner, CENIC is adding wireless to the last mile of their fiber networks. “GeoLinks is a very innovative private sector fixed wireless provider,” said Fox.

Fox hesitates to embrace the hype around 5G.

“I don’t really know about the applicability of 5G for these at least initial precision agriculture applications. … As for technology, we only want the one that works best for the occasion. Right now, for us it’s been a big step to get into fixed wireless, and again, we don’t; we run a fiber network. We work with either the researchers or with the private sector to connect them via fixed wireless. They connect to the nearest point of presence on our network.”

How it started 

“We wanted to smash distance and we wanted to smash time,” said Stuart Lynn, the CIO for the UC system in the 1990s, in a video. “We wanted to break those barriers down to facilitate really effective research and educational collaboration.” Twenty years ago, Lynn wanted to tie all the California university networks together in a high-quality, private network.

The U.S. National Science Foundation (NSF) originally funded the networks for the California universities Caltech, Stanford, University of Southern California, and the University of California in 1996-1997. NSF continued to fund a network through CSU that eventually became the CalREN NOC in 1999.  “What’s great about [CalREN] is you’re connected to a regional, national, and international fabric of research networks,” said Fox. “That allows access to data for scientific instruments and to scientific and agricultural collaborations across that fabric, and it’s a dedicated fabric for research. So that means that your data doesn’t have to transit the commercial Internet. You’re able to use this regional, national, and global fabric.”

On-fire examples of network use 

CENIC recognizes accomplishments from the projects and systems researchers and government officials devised using the network.

One fire-related project using the CENIC network is HPWREN, an effort of UC San Diego and the Scripps Institution of Oceanography.

“They have really created a wireless mesh in San Diego County that is absolutely critical for those communities, particularly around wildland fire and especially to give first responders situational awareness of what’s going on with the fires,” said Fox.

Alert Tahoe is a similar effort in Northern California led by University of Nevada Reno, which puts sensors, high-def cameras and instruments around Lake Tahoe.

“They have dealt with literally hundreds of fires,” said Fox.

Project WIFIRE, run by UC San Diego, uses the San Diego Supercomputer Center to collect data on wildfire behavior from the ground telemetry, weather data, and satellite data the system gathers. The supercomputer produces predictive analytics about how newly started fires will spread, which can help with evacuation and firefighting.

“It is increasingly a critical tool because when you understand that for your first responders, for instance, the tool is surprisingly accurate,” said Fox.

“California stands as a test effort for a civic research platform and the testbed for a lot of the other community efforts that CENIC and others are involved in,” said Fox. “There’s an incredible collegial and collaborative spirit between and among groups focused on broadband access… there’s a real esprit — a desire to figure out how to solve these problems, which are not easy ones for a lot of these communities because they have small populations, they are dispersed and investments in infrastructure are pretty complex.”

Despite California being the 6th largest economy in the world, “it’s not easy for a commercial entity to see a return on investment. It requires pooling resources, pooling subsidies, very community-specific kinds of solutions and projects for addressing these disparities across California,” said Fox. “There’s a sort of can-do attitude here that I think sets the stage for what’s possible elsewhere in the U.S.  I’ve done this kind of work in a lot of other states and other countries but there is this indomitable spirit here. And collectively we will figure this out.”

“I encourage continued work [on] this idea of just making the entire state of California one gigantic wireless mesh,” said Vint Cerf, Internet pioneer, at CENIC’s conference in March.

This article originally appeared in the RCR Wireless News, July 10, 2018, and is re-posted with permission in the UC IT Blog.

The future has arrived; it’s Smart, and we’re not ready for it. Here’s why.

Read the original article on Medium.com

From Washington, D.C., to the coast of California, “Smart City” is, and was, perhaps 2018’s most prominent buzzword aside from “5G”, circulating nearly all tech, economic, and broadband-related conferences and forums. While the exact definition of what really is a “Smart City” varies by person and party, the concept itself is based on the integration of Information and Communication Technologies (ICT) and the Internet of Things (IoT) to optimize city-wide operations and services, and ultimately connect to citizens.

While some of the general public still think of this concept as far off, the reality is that “Smart Cities” have already begun materializing across the country. Thus, this glorified digital future is here, and guess what, America: we’re not ready.

Why Not?

Well, it’s simple really. Cities and their citizens can have all the ICT or IoT devices they want, but in order to make a city smart, these systems and gadgets have to physically work. That’s where connectivity comes into play. To fuel a Smart City, you need broadband Internet access with enough bandwidth to support electronic data collection and transfers. According to the Federal Communications Commission’s (FCC) 2018 Broadband Deployment Report, upwards of 24 million Americans still lack access to high-speed broadband. Furthermore, the report states that approximately 14 million rural Americans and 1.2 million Americans living on Tribal lands still lack mobile LTE broadband at speeds of 10 Mbps/3 Mbps. Finally, only 88% of American schools were reported to meet the FCC’s short-term connectivity goal of 100 Mbps per 1,000 users, and only 22% of school districts met its long-term connectivity goal of 1 Gbps per 1,000 users.

On December 4th, the New York Times released an article titled, “Digital Divide Is Wider Than We Think, Study Says” that refuted the FCC’s published report. Based on a study conducted by Microsoft, the article summarizes that researchers concluded “162.8 million people do not use the internet at broadband speeds… In Ferry County, for example, Microsoft estimates that only 2 percent of people use broadband service, versus the 100 percent the federal government says have access to the service.”

So, regardless of which multi-million-person statistic we conclude is more legitimate, while many metro areas may have the bandwidth needed to at least partially move forward into the next digital revolution, there are still millions of Americans who would, as it stands, be left behind. This reality, coined the digital divide, is the ultimate Smart City roadblock.

Why being hyper fiber-minded is our fatal flaw:

States and communities across the country advocate that pervasive fiber network expansion is the solution to closing the divide. And yes, fiber networks can be great. The reality is, however, that building out fiber infrastructure to every location in America is time-consuming, tedious, and prohibitively expensive. Therefore, deploying fiber does not make economic sense in many rural and urban areas of the country. The Google Fiber project serves as a prime example of this.

To summarize, Google officially launched its Google Fiber project in 2010, with more than 1,100 cities applying to be the “First Fiber City.” By 2011, Google announced it had selected Kansas City, Kansas as its pilot. Fast-forward to 2014, and Google missed its projected city-wide connection deadline in Kansas, citing delays. By 2016, Google publicly commented that all-fiber build-outs were proving infeasible due to costs and varying restrictive topographies, consequently filing with the FCC to begin testing wireless broadband in 24 cities. Within a few months, Google officially acquired a wireless broadband provider and formally announced fixed wireless as part of its Google Fiber network moving forward.

All in all, this case study demonstrates first-hand that to actually close the U.S. digital divide, our country must adopt a technology-agnostic mind-set and implement a hybrid-network approach that utilizes whatever technology or technologies make the most sense for a particular region. Technologies like Fixed Wireless, TV Whitespace, 4G, and Fixed 5G all have their place, alongside Fiber, in closing the divide. Unfortunately, until those in positions of influence are able to open their minds to these alternative methods, America will remain unconnected.

Who are people in positions of influence?

Luckily, our current FCC administration seems to at least semi-understand that fiber isn’t a “one-size-fits-all” solution, as demonstrated in the recent distribution of funding to WISPs in the CAF II Auction. However, many state and local governments remain less progressive. At a recent California Emerging Technology Fund (CETF) meeting in Sacramento, for example, a large majority of key broadband stakeholders and municipalities advocated that the California Department of Transportation’s (Caltrans) future infrastructure plans should be wholly fiber-based to support the future of Smart Cities and autonomous cars. Whether it be from a lack of education, poor past experiences, or simply riding the buzzword bandwagon, until government organizations can push past the common misconception that fiber is the only answer, community businesses and residents will be left in the divide.

So, what’s the “Smart” thing to do now?

For those cities in America already connected with reliable multi-gig Internet, go ahead, smart things up! Just keep in mind, to remain a Smart City, even fiber-rich metros will eventually need to extend current network infrastructure to new end points such as light poles, unconnected buildings, and future city expansions.

Ultimately, if we want to collectively prepare for this new revolution, we need to first focus on closing the digital divide. First comes broadband, then comes innovation, then comes the utopian idea of not only Smart Cities, but a smart country.

Related Suggested Articles:

Five Crucial Steps Needed To Close The U.S. Digital Divide

Grow Food, Grow Jobs: How Broadband Can Boost Farming in California’s Central Valley

Digital Divide Is Wider Than We Think, Study Says

How Community Anchor Institutions Can Help Close the Digital Divide

Rural service is key to bridging the digital divide

Comments to Consider Modifications to the California Advanced Services Fund

BEFORE THE
CALIFORNIA PUBLIC UTILITIES COMMISSION

Order Instituting Rulemaking to Consider
Modifications to the California Advanced Services Fund.
Rulemaking No. 12-10-012 (Filed October 25, 2012)

 

OPENING COMMENTS OF CALIFORNIA INTERNET, L.P. (U-7326-C) DBA GEOLINKS ON PROPOSED DECISION OF COMMISSIONER GUZMAN ACEVES IMPLEMENTING THE CALIFORNIA ADVANCED SERVICES FUND INFRASTRUCTURE ACCOUNT REVISED RULES

November 29, 2018 

Pursuant to Rule 14.3 of the Commission’s Rules of Practice and Procedure, California Internet, L.P. (U-7326-C) dba GeoLinks (“GeoLinks” or the “Company”) respectfully submits these comments on the Proposed Decision of Commissioner Guzman Aceves, entitled “Decision Implementing the California Advanced Services Fund Infrastructure Account Revised Rules” (“Phase II PD”), released on November 9, 2018.

GeoLinks limits these comments to one section of the Phase II PD regarding the Ministerial Review process (Section 2.3). In the Phase II PD, while the Commission acknowledges GeoLinks’ concerns regarding the lack of technology neutrality in the proposed ministerial review process with respect to the maximum price per household for fiber projects vs. fixed wireless projects, the Commission fails to actually make the process technology neutral. Specifically, while the Phase II PD does lower the maximum amount per household eligible for ministerial review for fiber projects (from $8,000 to $6,000 per household), that figure is still, inexplicably, several thousand dollars more than the threshold for fixed wireless projects ($1,500 per household).

The Phase II PD fails to provide any rationale for the proposed thresholds or even attempt to explain why the proposed fiber threshold is $4,500 per household higher than the proposed fixed wireless threshold. GeoLinks assumes these numbers are based on averages taken from previously approved CASF projects, but this is not clear. For example, while the CASF Annual Report for 2016 explains that the average cost of 15 CASF fiber projects was $9,442 per household, inclusive of middle-mile costs, the Phase II PD does not address this average in any way or explain how the new $6,000 threshold may or may not be related to it. The Phase II PD is completely silent as to how the proposed thresholds were conceived, what they may or may not be based on, or why they cannot be the same for both technology types.

Moreover, while the Phase II PD does note that the ministerial thresholds do not preclude fixed wireless projects from being awarded grants that fall outside the ministerial cost criteria, it makes very clear that these projects (even if still significantly less costly per household than proposed fiber projects that may offer the same speed to the same areas) must go through the Commission’s Resolution process (which is presumably longer and requires a Commission decision). GeoLinks asserts that 1) creating separate thresholds for separate technologies that offer the same service, 2) requiring one technology to endure a procedural process that another would not for what might otherwise be an identical proposed project, and 3) failing to provide any explanation for why the cost threshold or the path to approval differs for one technology over another are all examples of bad public policy. In all, the Commission’s retention of differing thresholds for fiber projects vs. fixed wireless projects is in direct opposition to the Commission’s goal of administering the CASF program on a “technology neutral” basis and should be rejected.

GeoLinks urges the Commission to create one ministerial threshold for all technology types. Specifically, GeoLinks suggests $4,000 to create some balance between the current inequity of $6,000 (fiber) vs. $1,500 (fixed wireless).

Respectfully submitted,

/s/ Melissa Slawson
Melissa Slawson
General Counsel, V.P. of Government Affairs and Education
California Internet, L.P. dba GeoLinks
251 Camarillo Ranch Rd
Camarillo, CA 93012

November 29, 2018

[1] California Advanced Services Fund: A Program to Bridge the Digital Divide in California, Annual Report January 2016 – December 2016 (issued April 2017), at page 43, FN 51.
[2] Interim Opinion Implementing California Advanced Services Fund, Decision 07-12-054 (rel. December 20, 2007), at 8: “The CASF shall be administered on a technology neutral basis by the Commission.”  See also id. at 28: “CASF funding proposals will be reviewed based upon how well they meet the criteria for selection as set forth below, and, where applicable, compared with any competing claims to match the deployment offer under superior terms. Such criteria should be evaluated on a competitively neutral basis.” (Emphasis added.)

Wireless Smart Farming to Keep Frost Away From Citrus

A UCSB SmartFarm sensor, approximately 5 feet off the ground and surrounded by citrus, will help UC researchers know when to turn on wind fans to protect plants from frost.

By Susan Rambo.

Computer science researchers from the University of California, Santa Barbara, are using the internet of things to prove that smart farming can be a farm implement as basic as the tractor and plough.

The husband-and-wife team of Chandra Krintz and Rich Wolski, both UCSB computer science professors, think data analytics can help tackle some of the tough challenges of modern agriculture. They want to apply to agriculture the predictive mathematical leaps that modern internet commerce uses to predict what people will buy. The pair created the UCSB SmartFarm program in response to what they see as the main issues of agriculture.

Krintz and Wolski cite U.S. Department of Agriculture and United Nations Food and Agriculture Organization studies that say some scary stuff: ever more food is needed to feed the growing global population, and yet farm labor is in short supply or too expensive. Eighty percent of fresh water and 30% of global energy is used to produce food, half of which we waste through spoilage. Farming also has some particularly tough foes: pests and disease attack farms’ output, and farmland is subsiding (sinking) — especially in California — because of groundwater overdraft. On top of all that, agriculture produces 22% of greenhouse gases.

The only way smart farming can make a dent in those issues is to attack specific problems. For their first test projects, Krintz and Wolski talked to the farmer — in this case, farm researchers — before designing a system. Although almost every ag tech pitch begins with a summary of those issues, the UCSB computer scientists’ approach is to come up with scientifically vetted data about the usefulness of cloud and data analytics in farming.

The design parameters behind UCSB SmartFarm’s Farm Cloud System are to make a system a farmer could love: it should be easy to use and work reliably, cheaply, and privately — farmers don’t want their data accessible. The system needs to provide useful data to help increase yield, automate farm operations, or save money (or all three), and the data must be available in real time. The whole thing has to work without IT staff.

The self-managing system needs to work like an appliance, like your refrigerator, write Krintz and Wolski in a presentation about the project.

Krintz and Wolski are testing the system on nut trees at Fresno State and on citrus at the University of California’s Lindcove Research and Extension Center (LREC) near Visalia, Calif. The UCSB SmartFarm program has support from Google, Huawei, IBM Research, Microsoft Research, the National Science Foundation, National Institutes of Health and the California Energy Commission.

RCR Wireless News visited the LREC — a literal test bed for citrus and smart farming — and got the full tour of the UCSB’s Farm Cloud System.

Lindcove’s research mandate

The public is probably not aware that agricultural research centers, such as LREC (Lindcove), do the hard science that protects our food. In the case of Lindcove, hard science is the study of mostly citrus trees, and it means the grueling work of studying each tree.

Dr. Beth Grafton-Cardwell, a research entomologist, integrated pest management (IPM) specialist, and Lindcove’s director, remembers sorting fruit by hand.

“When I first started in 1990, if we harvested in January, we would stand in the field in our long underwear and they would pick fruit into a bin and we would have ring sizers that told us what size the fruit was. We would count the fruit and size the fruit and write it on a clip board on a piece of paper,” she said. “Now this machine can do this better.”

Standing near a huge packing line machine that dwarfed her, Grafton-Cardwell explained how the cameras and the extra sensors enable the machine to size and weigh the fruit, examine the outside of the fruit using three types of cameras and estimate the sugar levels inside. One piece of fruit goes through the machine at a time, for scientific purposes, which differs from how a normal packing house operates.

“If I am a researcher, each of my trees is a replication and a different situation, so I want to know everything there is to know about the fruit on that tree,” said Grafton-Cardwell. The cameras take about 30 photographs of each piece of fruit, rotating the fruit as they go. Every parameter from each piece of fruit is put into a spreadsheet: “We know the size, the shape, if it has scarring, the precise color,” said Grafton-Cardwell.

The growers paid for Lindcove’s packing line. “We can simulate anything you want to do on a commercial pack line,” said Grafton-Cardwell. Most packing houses have these machines but don’t use them the way researchers do. They use them for sorting fruit, not for collecting the precise data the researchers need.

“You have to train the machine to the colors and the blemishes. It can get overwhelming,” said Kurt Schmidt, Lindcove’s principal superintendent of agriculture. “We can slow everything down and gather an infinite amount of data.”

“The data sets are ginormous,” Grafton-Cardwell pointed out. Data, and the interpretation of that data, are really the product that Lindcove produces.

Founded in 1959 by the University of California, Riverside and San Joaquin Valley citrus growers, Lindcove lets growers try out treatments and crop varieties without experimenting on their own crops, which protects their orchards — and livelihoods. “Researchers from around the state can come here and do experiments,” said Grafton-Cardwell. Lindcove focuses on creating new varietals and maintains a demonstration garden of hundreds of citrus varieties — a garden replicated in several other locations, such as the desert, for comparison. The center is working on 30 research projects right now.

“Citrus grows quite easily statewide … there are 300,000 acres [planted] statewide. It’s all fresh market; [California growers] don’t do juice. If the growers produce for juice, they lose money,” said Grafton-Cardwell. Florida and Brazil are the juice producers.

“Their climate produces a better juice fruit,” said Schmidt.

Lindcove is one of nine research centers in the University of California’s Agriculture and Natural Resources (ANR) division. With soil and climate typical of commercial citrus growing in California’s Central Valley, Lindcove’s 175 idyllic acres may be tucked remotely against the Sierra foothills on the road to Sequoia National Park, but the center is on the forefront of fighting some pretty scary citrus pests.

The Huanglongbing (HLB) bacterium has California’s citrus industry in an increasing panic. The bacterium, spread by the Asian citrus psyllid, a small insect imported from Asia, has already made its way up through Mexico and is now in Southern California, spreading northward.

Huanglongbing, also known as citrus greening disease, is killing trees at alarming rates, and there is no cure yet. “It has devastated Florida. Huanglongbing has knocked their acreage down by 50 percent,” said Grafton-Cardwell. “We are trying to get some proactive research going to prepare for the arrival of the disease in the commercial citrus. Right now it is just in residential backyards, but it is going to get to the commercial citrus in the near future,” she said.

In California, it is particularly hard to control because of the prevalence of backyard citrus trees.

“Right now it is just in Southern California. We are up to about 650 trees in Southern California that tested positive,” said Grafton-Cardwell. All of those infected trees were in residential yards. Therein lies the problem: An estimated 60% of homeowners have a citrus tree in their yard. “That’s like 15 million citrus trees. How do you manage a disease when you’ve got 30 million commercial trees and 15 million residential trees? It is very difficult,” she said. “Homeowners don’t understand plant disease, they don’t understand how to manage the pest, they don’t understand the risk.”

Unrelated to HLB, but nonetheless an insurance policy for all citrus growers, is Lindcove’s Citrus Clonal Protection Program (CCPP) out of UCR. Lindcove preserves and archives original budwood of citrus varieties as part of CCPP. Large screenhouses — greenhouses with screens instead of glass — hold clean budwood, which nurseries, growers and even citrus enthusiasts can use to propagate citrus plants. The citrus buds are grafted to rootstock and grown into trees in the screenhouses, where they are protected from insects.

The screens on these structures are “rated for thrips” — so fine that thrips or psyllids can’t get through them. Recently, when one of the screens was breached, the CCPP program restarted all the trees in the screenhouse to make sure they were free of insects and disease. This is serious business.

First, the network

Lindcove has a new network capability now. “We are really excited,” said Dr. Grafton-Cardwell. “It has taken us ten years to get to the point where we have a network that can support all this, because we are out in the boonies.”

Lindcove now uses the fiber network from CENIC — the non-profit network operator for California’s universities, colleges, schools and libraries — and fixed wireless company GeoLinks for the last mile.

“We were getting our internet from a local provider here in Visalia with limited bandwidth for a lot of money,” said Schmidt. “So now we’ve got this big connection that has the potential to have a large bandwidth. We’re in pretty good shape.”

“ANR pushed really hard in the last couple years to develop the funding to do this for all the research and extension centers, all nine of them, because we were all created back in the 1950s, and most of us in the boonies, and none of us had decent network capability. For scientists in this day and age to do research, it is totally revolutionary,” said Grafton-Cardwell. “When I first came in 1990, we weren’t able to do any of this stuff. Computing was really primitive and now it is going to improve what we do.”

Smart farm at Lindcove

“I didn’t even know what the internet of things was before Rich Wolski explained it,” said Grafton-Cardwell, but now she can’t wait to get it.

The goal of UCSB’s smart farm test at Lindcove is to improve decision making around frost protection for citrus growers, which should help reduce both costs and carbon footprint.

Schmidt pointed out the culprit: the big wind machines on citrus farms. The machines are needed because, on frosty nights, an inversion layer of warmer air traps cold air near the ground, where it damages fruit. The wind machines circulate the air, mixing the warmer air back down, when frost is imminent. Running them is expensive: they burn propane. And that’s not counting the cost of driving a truck around the fields at all hours, taking temperature readings to decide when to turn the machines on.

Krintz and Wolski’s team of students has installed low-cost, sturdy weather stations that can withstand the elements and accurately sense temperature and humidity at 5 feet and 30 feet above the ground. The stations are positioned to monitor within 3 feet of the boundaries of the wind machines’ coverage. The poles also carry surveillance cameras with infrared capability, allowing temperature measurement beyond what regular thermometers provide. A network station in the field moves the data to the on-site office. Drones could be used “on the fly” to monitor at different heights.

Measuring and estimating evaporation and transpiration under the tree canopy and sending that data to the office means that someone like Kurt Schmidt won’t have to take temperature readings manually, every hour at all hours, to determine when to turn on the fans. Tapping into Schmidt’s knowledge of when the fans need to run will also help inform the system: Krintz and Wolski can write software to automate fan operations. Having more detailed information in real time means saving fuel whenever a wind machine at one end of a microclimate doesn’t need to be turned on, even though others do.
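A minimal sketch of what that fan-control logic might look like, written in Python; the station names, temperature thresholds and decision rule below are illustrative assumptions, not the UCSB team’s actual code:

    from dataclasses import dataclass

    @dataclass
    class StationReading:
        station_id: str
        temp_5ft_f: float    # air temperature at 5 feet, degrees F
        temp_30ft_f: float   # air temperature at 30 feet, degrees F

    FROST_TRIGGER_F = 32.0     # assumed trigger point, not a measured value
    INVERSION_DELTA_F = 2.0    # assumed minimum warm-air margin aloft

    def fan_should_run(r: StationReading) -> bool:
        """Run the wind machine only when ground-level air is near freezing
        and there is warmer air above for the machine to mix down."""
        near_frost = r.temp_5ft_f <= FROST_TRIGGER_F
        inversion = (r.temp_30ft_f - r.temp_5ft_f) >= INVERSION_DELTA_F
        return near_frost and inversion

    # Deciding per station is what saves propane: a fan at the warm end of a
    # microclimate stays off even while another end of the orchard runs.
    for r in [StationReading("north-block", 31.0, 34.5),
              StationReading("south-block", 34.0, 34.5)]:
        print(r.station_id, "fan ON" if fan_should_run(r) else "fan off")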

This frost experiment is only the beginning.

“We have a laboratory here that has equipment in it that again, we could be connecting,” said Grafton-Cardwell. “One of the things I proposed to Chandra [Krintz] and Rich [Wolski], is we have all these data in separate units. The pack line generates data, we are collecting data from the field. That is going into files. The data aren’t connected in any shape or form.”

Grafton-Cardwell’s ultimate goal is to have a researcher go into a portal and view all the data associated with their research.
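Connecting those silos is largely a data-plumbing problem. As a rough sketch, assuming hypothetical CSV exports from the pack line and the field sensors that share a tree identifier (the file and column names here are invented for illustration), a first version of that portal view could be a simple join:

    import pandas as pd

    # Hypothetical file and column names; the real Lindcove exports may differ.
    packline = pd.read_csv("packline.csv")       # one row per fruit: tree_id, size_mm, color, scarring
    sensors = pd.read_csv("field_sensors.csv")   # one row per reading: tree_id, timestamp, temp_f

    # Summarize each tree's sensor history, then attach it to every fruit record.
    per_tree = sensors.groupby("tree_id")["temp_f"].agg(["min", "mean"]).reset_index()
    per_tree.columns = ["tree_id", "temp_min_f", "temp_mean_f"]

    portal_view = packline.merge(per_tree, on="tree_id", how="left")
    portal_view.to_csv("research_portal_view.csv", index=False)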

This article originally appeared in RCR Wireless News on July 17, 2018, and is re-posted with permission on the UC IT Blog.

Susan Rambo covers 5G for RCR Wireless News.

GeoLinks CEO Joins FCC Broadband Committee

GeoLinks Chief Executive Skyler Ditchfield has been appointed to a working group of the Federal Communications Commission’s Broadband Deployment Advisory Committee.

As a member of the Disaster Response and Recovery Working Group, Ditchfield will help recommend measures to improve broadband infrastructure before disasters happen and to restore it afterward.

Ditchfield is the only California representative and the only fixed wireless broadband provider in the working group. His Camarillo company is a mid-sized internet service provider.

Ditchfield said he was honored and excited to be part of the working group, adding that over the past few fire seasons his staff at GeoLinks has gained experience restoring connectivity during natural disasters.

“I am confident our working group can not only improve the resiliency of broadband infrastructure before disasters occur nationally, but also ensure that connectivity is both maintained and restored as quickly as possible,” Ditchfield said in a statement.

Fixed Wireless Case Study by GeoLinks – The Coffee Bean & Tea Leaf

The Coffee Bean & Tea Leaf was slated to open a series of new stores in Southern California in 2016 and needed more than 30 Wi-Fi circuits to support both its public Wi-Fi and POS systems within 20 days. The company initially contracted to provide a terrestrial connection was projecting massive delays and restrictions on available bandwidth. To meet its quickly approaching deadlines, Coffee Bean looked to contract a local outside provider to deliver a temporary solution.

Advantages and Disadvantages of Broadband Technologies for Rural America – Infographic

Infographic by GeoLinks