Frank Konkel

House Passes Modernizing Government Technology Act

One week after Rep Will Hurd (R-TX) introduced new IT legislation designed to thrust government into 21st-century technologies, the House of Representatives passed the Modernizing Government Technology Act on a voice vote. “Many parts of the federal government’s IT infrastructure are stuck in the Stone Age,” said Rep Hurd, who chairs the House IT Subcommittee. “The MGT Act will save taxpayer dollars, increase government accountability, and help government be more efficient in serving the American people.”

The MGT Act is essentially a combination of two prior pieces of legislation, drawing language from the MOVE IT Act Hurd introduced earlier this summer and the White House-backed IT Modernization Fund introduced by Rep Steny Hoyer (D-MD) in the spring. The bill calls for the creation of working IT capital funds in CFO Act agencies, allowing agencies to bank savings from ongoing modernization efforts. As a whole, government spends approximately 80 percent of its $90 billion IT budget -- roughly $72 billion a year -- on legacy systems. Under the bill, agencies that demonstrate savings from modernization would be allowed to reinvest those savings in further modernization efforts, such as moving to the cloud.

Survey Identifies Old IT Culprits As Top Barriers To More Open Government

Unlocking government data is no easy feat, and according to recent survey data gathered by the Government Business Council, the chief obstacles to a more open government are familiar problems in the IT world.

The survey tallied responses from 75 civilian and military IT leaders (GS-14 or higher), and respondents identified concerns over data sensitivity (68 percent), a perceived lack of funding (62.5 percent), privacy (61.1 percent), and unstandardized data (59.7 percent) as the chief challenges to more open data in government.

GBC conducted the survey in part to ascertain how the government would react to Congress’ passage of the Digital Accountability and Transparency Act, which mandates that the executive branch publish US federal spending in open, standardized datasets readily available to the public. The DATA Act gives agencies more incentive to push appropriate data into the public eye, and it was preceded by a May 2013 executive order that spurred agency efforts to lay the groundwork for making open, machine-readable datasets the default in government.

The GBC survey suggests agencies may have made strides on some of the steps outlined on the Open Government Dashboard, and fewer on others that one would expect to have come first.

Big Data’s Coming Role In Cybersecurity

Every day, people, machines and the world’s growing multitude of sensors create more than 2.5 exabytes of data -- that’s 2.5 quintillion, or 2.5 × 10^18, bytes -- a bonanza of bits and bytes that is in many ways a double-edged sword.

On one hand, private sector companies and the government are able to collect more data than ever for analysis -- ideally, that’s a great thing. Never before has humanity had access to the kinds of data it does now. Yet big data sets are also attractive to hackers and malicious actors who see more data as more money or intelligence to steal.

The two disciplines -- cybersecurity and big data -- are beginning to meld so that it’s difficult to talk about one without the other. Agencies across government are learning to better detect and analyze cyber threats, and one of the ways they are doing so involves big data.

For example, agencies might sift through huge piles of data as they monitor traffic in and out of a network in real time to detect potentially adversarial anomalies. It takes a lot of technological horsepower to analyze that information, but the insight it provides could be the difference between a massive leak or media frenzy and business as usual.
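As a rough illustration of that kind of monitoring -- a sketch under assumed conditions, not any agency's actual tooling -- the Python snippet below flags per-minute traffic volumes that drift far from a trailing baseline. The window size, threshold, and sample data are all illustrative assumptions.

```python
# Hypothetical sketch: flag minutes whose network traffic volume deviates
# sharply from a trailing baseline -- a simple stand-in for the real-time
# anomaly detection described above. Thresholds and data are illustrative.
from collections import deque
from statistics import mean, stdev


def flag_anomalies(byte_counts, window=60, threshold=3.0):
    """Yield (minute_index, byte_count) pairs whose z-score against a
    trailing window of per-minute byte counts exceeds the threshold."""
    history = deque(maxlen=window)
    for i, count in enumerate(byte_counts):
        if len(history) >= 2:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(count - mu) / sigma > threshold:
                yield i, count  # potential exfiltration spike or scan
        history.append(count)


# Steady traffic with one suspicious burst at minute 8.
traffic = [1000, 1040, 980, 1010, 995, 1020, 990, 1005, 9800, 1000]
print(list(flag_anomalies(traffic, window=5)))  # -> [(8, 9800)]
```

In practice, agencies layer far richer signals -- flow records, packet metadata, threat intelligence -- on top of this basic idea, which is exactly where the big data tooling comes in.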

Satellite Rescue Network Gets Space Technology Hall Of Fame Recognition

Since 1982, 37,000 people, including 7,000 Americans, have survived potentially disastrous incidents because of the COSPAS-SARSAT rescue network. That record earned the satellite system an induction into the Space Technology Hall of Fame.

The honor recognizes technologies originally developed for space applications that ultimately improve life on Earth, and few technologies rival COSPAS-SARSAT’s life-saving record.

In 2013 alone, COSPAS-SARSAT’s network of satellites that detect and locate distress signals from emergency beacons led to the rescue of 253 people from potentially deadly situations. The network involves numerous satellites, including the National Oceanic and Atmospheric Administration’s geostationary and polar-orbiting satellites. Altogether the program comprises 43 countries and organizations.

How The Right People Analyzing The Best Data Are Transforming Government

Analytics is often touted as a new weapon in the technology arsenal of bleeding-edge organizations willing to spend lots of money to combat problems. In reality, that’s not the case at all.

Certainly, there are complex big data analytics tools that will analyze massive data sets to look for the proverbial needle in a haystack, but analytics 101 also includes smarter ways to look at existing data sets.

In this arena, government is making serious strides, according to Kathryn Stack, advisor for evidence-based innovation at the Office of Management and Budget. Interestingly, the first step has nothing to do with technology and everything to do with people. Get “the right people in the room,” Stack said, and make sure they value learning.

Finally, Stack said it’s common for agencies to tackle analytics problems by acquisition. That’s a backwards approach in which the only guarantee is that your agency is going to spend money. Instead, Stack recommended agencies “think about contractors less,” and focus first on reaching out to academic researchers, nonprofits and foundations. Don’t sleep on government peers from other agencies, either.

Feds Could Save $20 Billion With Better IT Infrastructure Initiatives, Study Finds

Perhaps data center consolidation, virtualization, cloud computing, remote access and infrastructure diversification aren’t the sexiest terms in the federal repertoire, but they do hold the keys to as much as $20 billion in annual savings, according to a study by MeriTalk.

The study, underwritten by Brocade, is based on survey results from 300 federal network managers who estimate that if the government were to fully leverage all five initiatives, it could save about 24 percent of the government’s $80 billion information technology budget.

The survey’s results sound promising, but there’s a caveat: Two-thirds of the surveyed network managers reported their networks are ill-equipped to meet current mission needs, and are further still from being able to fully embrace newer tech initiatives like cloud computing. If network managers could flip a switch and increase network speed by roughly 26 percent, the survey estimates, the government could bank $11 billion in savings in a single year.

Lawmakers Say Favored NSA Reform Bill Doesn’t Go Far Enough

A group of lawmakers concerned about weaknesses in the most popular surveillance reform bill circulating on Capitol Hill wants to insert an amendment that would bar the National Security Agency from weakening encryption standards or exploiting large-scale Internet security vulnerabilities.

According to a report in the Guardian newspaper, Rep Zoe Lofgren (D-CA) and other House members want to stop the NSA from “utilizing discovered zero-day flaws,” like the Heartbleed flaw made public in April that compromised countless online systems. The proposed amendment, the report claims, would also not allow the NSA “to create them, nor to prolong the threat to the Internet” by failing to warn against vulnerabilities.

NIST Removes NSA-Tainted Algorithm From Cryptographic Standards

The National Institute of Standards and Technology has finally removed a cryptographic algorithm from its draft guidance on random number generators, more than six months after leaked top-secret documents suggested the algorithm had been deliberately sabotaged by the National Security Agency.

The announcement came as NIST opened its revised Special Publication 800-90A to a final round of public comments. The draft now contains three algorithms, the Dual Elliptic Curve Deterministic Random Bit Generator having been removed following negative feedback from the public.
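For context, the three generators that remain in the draft are Hash_DRBG, HMAC_DRBG and CTR_DRBG. The Python sketch below shows the core key-and-value loop of HMAC_DRBG in heavily simplified form; it omits the reseed counter, entropy and security-strength checks, and the other requirements the standard imposes, so treat it as an illustration, not an implementation.

```python
# Heavily simplified sketch of HMAC_DRBG, one of the three generators
# retained in the revised SP 800-90A after Dual_EC_DRBG was dropped.
# Omits reseed counters and security checks -- illustration only.
import hashlib
import hmac
import os


class HmacDrbgSketch:
    """Simplified HMAC_DRBG over SHA-256 (illustrative, not for production)."""

    def __init__(self, entropy: bytes, personalization: bytes = b"") -> None:
        self.key = b"\x00" * 32    # outlen bytes of 0x00
        self.value = b"\x01" * 32  # outlen bytes of 0x01
        self._update(entropy + personalization)

    def _hmac(self, data: bytes) -> bytes:
        return hmac.new(self.key, data, hashlib.sha256).digest()

    def _update(self, provided_data: bytes) -> None:
        # K = HMAC(K, V || 0x00 || data); V = HMAC(K, V); repeat with 0x01
        self.key = self._hmac(self.value + b"\x00" + provided_data)
        self.value = self._hmac(self.value)
        if provided_data:
            self.key = self._hmac(self.value + b"\x01" + provided_data)
            self.value = self._hmac(self.value)

    def generate(self, num_bytes: int) -> bytes:
        out = b""
        while len(out) < num_bytes:
            self.value = self._hmac(self.value)
            out += self.value
        self._update(b"")  # re-key after every generate call
        return out[:num_bytes]


drbg = HmacDrbgSketch(os.urandom(32))
print(drbg.generate(16).hex())
```

Unlike Dual_EC_DRBG, whose fixed elliptic-curve constants could conceivably have been chosen with a hidden trapdoor, HMAC_DRBG’s security rests on the underlying hash function.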

According to documents leaked by former NSA contractor Edward Snowden in September, NSA “became the sole editor” of Special Publication 800-90 and allegedly introduced weaknesses into the now-removed algorithm.

NIST responded swiftly to that news, recommending against using the standards and suggesting they be reopened to public scrutiny in an effort to rebuild public trust.