7 top apps for sharing large files

These days, it's common for people to get work done using more than one device -- whether you're sending out an email on your smartphone, checking Slack on your tablet or crafting spreadsheets on your PC. And it's created a demand for storage options that let you easily access files -- especially oversized files -- across numerous devices and operating systems, no matter where you are. The good news is that just one quick Google search will uncover an overwhelming number of companies that want to help you painlessly share large files. But with so many options, it can be hard to figure out which ones offer the best features at a reasonable price for your specific needs. If you just need a way to store and share your smartphone pics, then you're going to need different storage options than a freelancer, small business or large corporation. Luckily, there is an option for everyone. Here are seven of the best apps to share, store and edit large files no matter what device you're on.

Micron’s super-fast 3D Xpoint SSDs will ship through storage partners

Intel won't be the only company gearing up to sell super-fast SSDs based on the new 3D Xpoint storage and memory technology; other storage companies will also offer them with technology provided by Micron. Micron detailed its 3D Xpoint plans Tuesday at the Flash Memory Summit conference in Santa Clara, California. Intel and Micron are taking divergent paths to bring 3D Xpoint to customers: Intel will sell its own 3D Xpoint SSDs and memory DIMMs, while Micron is partnering with well-established storage companies to sell 3D Xpoint SSDs. SSDs and DRAM will run much faster with 3D Xpoint, which essentially unifies storage and memory. Intel claims 3D Xpoint will be 10 times denser than DRAM, and it has demonstrated the technology running 10 times faster than current SSDs.

An engineer uses IoT to tackle illness

Daniel Strabley's day job is helping to protect the U.S. from weapons of mass destruction. He works on a software suite that interacts with sensors to detect chemical and radiation threats. The sensor information, as you may imagine, is complicated, and one of his tasks is to make it understandable to users. That means anyone, from an Army private not long out of high school to a Ph.D.-holding nuclear physicist, needs an interface that is meaningful to their knowledge level. The work on detecting weapons of mass destruction is similar in concept to what Strabley is doing to help his wife's grandfather, who is suffering from dementia. He has written software that can help people with varying degrees of cognitive issues, and is using sensors, such as Amazon's new IoT buttons, to improve communication.
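The article doesn't show Strabley's code, but a common way to wire an AWS IoT button into this kind of workflow is to have each press invoke an AWS Lambda function that forwards a pre-agreed message to a caregiver. The sketch below is illustrative only and assumes names that are not in the article: the SNS topic ARN and the click-to-message mapping are hypothetical, while the event fields follow the format the AWS IoT Button delivers to Lambda.

```python
import boto3

# Hypothetical SNS topic that a caregiver's phone or email is subscribed to.
CAREGIVER_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:caregiver-alerts"

sns = boto3.client("sns")


def lambda_handler(event, context):
    """Triggered by an AWS IoT button press; relays a simple message via SNS."""
    click_type = event.get("clickType", "SINGLE")  # SINGLE, DOUBLE, or LONG press

    # Map the three click types to messages agreed on with the family ahead of time.
    messages = {
        "SINGLE": "I'm okay, just checking in.",
        "DOUBLE": "Please call me when you get a chance.",
        "LONG": "I need help right away.",
    }

    sns.publish(
        TopicArn=CAREGIVER_TOPIC_ARN,
        Message=messages.get(click_type, "Button pressed."),
        Subject="IoT button check-in",
    )
    return {"status": "sent", "clickType": click_type}
```

The appeal of a pattern like this for someone with cognitive issues is that the interface is a single physical button; all of the complexity lives on the cloud side.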

Global IT challenges: Privacy, standardization, transformation top the list

It's one of an IT leader's biggest nightmares: Imagine you've got a division in Russia, which has very strict privacy laws regarding employees' rights to control their information, and there are 12 employees who refuse to allow their information to leave company walls.

Whether it's introducing new UPS products or services such as My Choice to the marketplace, or implementing large-scale integrated capabilities like World Wide Express Saver Freight -- which rely on multiple systems -- "we choreograph the deployments across our teams and systems. We would first beta-test the releases in the individual markets such as the U.S., Europe and Asia, prior to executing the global deployment, testing all facets of the solution including the installation and functionality," Costides says.

IT managers critique enterprise flash-storage products

The market for enterprise flash array storage is highly competitive, with traditional storage vendors battling against innovative startups. Tech buyers are typically looking at factors such as ease of use, improved data management, performance, deduplication algorithms, a small footprint and low power usage.

White House software code sharing policy gains steam

The White House has released its Federal Source Code policy, which promotes reuse across the federal government of new source code developed by government agencies. The new policy also sets up a pilot program “that requires agencies, when commissioning new custom software, to release at least 20 percent of new custom-developed code as Open Source Software (OSS) for three years,” Tony Scott, U.S. CIO, and Anne E. Rung, chief acquisition officer, wrote in a memorandum to heads of departments and agencies on Monday. The federal government spends over US$6 billion a year on software through more than 42,000 transactions, but agencies that procure custom-developed source code do not necessarily make their new code broadly available for reuse by the federal government.

George Washington didn’t tweet here, but you may get 5G

Some people frown on Pokémon Go hunts in historic areas, but a new FCC ruling could make it even more tempting to risk a glare and a wagging finger. On Monday, the U.S. Federal Communications Commission announced a deal to make it easier for mobile operators and building owners to install cellular gear on many old buildings, including some in historic districts. Just because those structures may evoke the past doesn’t mean they can’t have the screaming 5G wireless speeds of the future.

CS101: Algorithms

First in this series is the subject of Algorithms. This topic is particularly interesting to me because when I first set out to understand what exactly algorithms were, I was expecting something far more complicated than what they turned out to be. I think, somewhat shamefully, that Hollywood may have had an influence on this, as “algorithm” is one of many terms abused by “cyber” movies and the like, portrayed as some sort of ultimate cyber weapon in the war against Ellingson Mineral Company.

The reality is much simpler. An “algorithm” is defined as “a set of steps that are followed in order to solve a mathematical problem or to complete a computer process.” It really is that simple. Think of a mathematical problem that you’d need to solve yourself (ignoring for the moment that there’s likely a third-party library that has already done this).

A common example is calculating the Fibonacci sequence. Forget about writing code for a minute and think about the problem in plain English: given a starting sequence (1, 1), how do you keep calculating and appending numbers to that sequence until you have produced N Fibonacci numbers?
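To make that concrete, here is a minimal sketch in Python (my choice of language for illustration; the post itself doesn't prescribe one). The algorithm is nothing more than the plain-English steps above written down: start from the seed pair (1, 1) and keep appending the sum of the last two terms until you have N numbers.

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers, starting from the seed pair (1, 1)."""
    sequence = []
    a, b = 1, 1
    for _ in range(n):
        sequence.append(a)   # record the current term
        a, b = b, a + b      # the next term is the sum of the previous two
    return sequence


print(fibonacci(10))  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```

That loop is the entire algorithm: a finite, ordered set of steps that turns an input (N) into a result (the sequence). Everything else -- the variable names, the language, whether you use a list or a generator -- is implementation detail.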

New Series: CS 101

Historically, my background is far closer to the systems side of things, but as I’ve picked up software development experience over the past few years, I’ve come to appreciate the fundamentals of computer science that others in my shoes may not have been exposed to. With that in mind, I have been working on a pseudo-formal blog series on computer science fundamentals.

These fundamentals have a wide variety of applications. Those with more of an IT-focused background will find that even if you don’t use graph theory or optimize algorithms in your day job, many of these concepts are at the crux of the technologies we use every day. If, like me, you’ve become bored with the endless cycle of IT certifications, learning these concepts can be a great addition to your skill set, since you can leverage them to extrapolate details about some of the “closed” products we use from IT vendors.

Finally, remember that the most important part of any of this is how the knowledge is applied. As you read the posts I’ll release in the next few weeks, keep in mind that understanding how to optimize a piece of code is useful, but knowing where and when to apply that understanding is what matters most.

Open vSwitch Now a Linux Foundation Project

News emerged today that Open vSwitch (OVS) has formally moved over to the Linux Foundation. This is something that has been discussed within the OVS community for a while, and I for one am glad to see it happen.

Why am I glad to see it happen? The project can finally shed itself of the (unfair) claims that the governance under Nicira (and later VMware) wasn’t “open enough.” These accusations persisted despite numerous indications otherwise. Thomas Graf, an OVS committer—who does not work for VMware, for the record—came to this conclusion in his OVSCon 2015 presentation:

OVS is one of the most effective and well governed open source projects I’ve worked on.

Moving to the Linux Foundation allows OVS to continue to grow and flourish without continued accusations of unfair governance. The project intends to continue to use its existing governance model, in which technical leadership of the project is determined by the committers, and committer status is determined by a contributor’s involvement in the project via code contributions and code reviews.

For more information, refer to the official Linux Foundation press release.

Deep Learning Chip Upstart Takes GPUs to Task

Bringing a new chip to market is no simple or cheap task, but as a new wave of specialized processors for targeted workloads brings fresh startup tales to bear, we are reminded again of just how risky such a business can be.

Of course, with high risk comes the potential for great reward -- that is, if a company is producing a chip that far outpaces general-purpose processors for workloads high enough in number to justify the cost of design and production. The standby figure usually cited there is around $50 million, but that is assuming a chip requires validation…
