It’s no secret that data centers are facing power shortages, especially in high-density areas. One colocation provider has come up with a unique solution: it’s building small nuclear power plants for itself.

Data center provider Standard Power specializes in high-performance computing, such as blockchain mining and AI workloads. These workloads demand a great deal of compute power, which translates into a very large electric bill. The company was concerned about the ability of local electric providers to deliver the capacity such demanding workloads require. So, rather than rely on the local electrical grid, Standard is partnering with NuScale Power Corporation, a maker of small modular nuclear power plants, for its Ohio and Pennsylvania facilities.
Linux provides a lot of ways to display date and time information, and not just for the current date and time. You can get information on dates way in the past or far in the future. You can also limit the output to just the current weekday or month. This post explains many of these options and provides examples of what you can expect to see.

Displaying the current date
Typing “date” on the Linux command line results in quite a bit more data than just the date. It also includes the day of the week, the current time and the time zone.

$ date
Mon Oct 16 11:24:44 AM EDT 2023
The command shown below displays the date in the shorthand month/day/year format.
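The full article’s command is cut off in this excerpt, but as a sketch, date’s %D format specifier produces exactly that shorthand (this may not be the article’s exact invocation):

$ date +%D
10/16/23

The + argument tells date to print according to the format string that follows; %D is equivalent to %m/%d/%y.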
This post covers some well-known Linux commands that, when used with particular options and arguments, can save you some time or ensure that what you are doing is what you intended. The first “trick” involves how you exit vi or vim and the difference that can make.

Using :x instead of :wq when saving files with vi or vim
The vi and vim editors are the most commonly used text editors on Linux systems. When you exit either with :x instead of the more common :wq, the file will only be saved if you have just made changes to it. This can be helpful if you want to ensure that the file’s timestamp reflects its most recent changes. Just keep in mind that if you make a change to a file and then undo it (say, deleting a word or a line and then replacing it with the same content), vi or vim will still see this as a change and save the file, updating the timestamp whether you use :x or :wq.
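Here’s a minimal sketch of how you might verify this behavior from the shell, assuming a file named notes.txt (the filename and sequence are illustrative, not from the article):

$ stat -c %y notes.txt    # note the modification time
$ vi notes.txt            # make no changes, exit with :x
$ stat -c %y notes.txt    # timestamp is unchanged
$ vi notes.txt            # make no changes, exit with :wq
$ stat -c %y notes.txt    # timestamp has been updated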
Immersion cooling specialist LiquidStack has introduced a pair of modular data center units that use immersion cooling for edge deployments and advanced cloud computing applications. The units are called the MicroModular and MegaModular. The former contains a single 48U DataTank immersion cooling system (the size of a standard server rack), and the latter comes with up to six 48U DataTanks. The products offer from 250kW to 1.5MW of IT capacity with a PUE of 1.02. (Power usage effectiveness, or PUE, is a metric for data center efficiency: the ratio of the total energy used by a data center facility to the energy delivered to computing equipment. A PUE of 1.02 means the facility draws just 1.02 watts for every watt that reaches the IT gear.)
Dell Technologies is expanding its generative AI products and services. The vendor introduced its generative AI lineup at the end of July, but that news centered on validating existing hardware designs for training and inferencing. Dell's new products are models made for customization and tuning. The name is a mouthful: Dell Validated Design for Generative AI with NVIDIA for Model Customization. The solutions are designed to help customers more quickly and securely extract intelligence from their data. There may be a race to move anything and everything to the cloud, but that doesn’t include generative AI, according to Dell's research. Among enterprises surveyed by Dell, 82% prefer an on-premises or hybrid approach to AI processing, said Carol Wilder, Dell's vice president for cross-portfolio software and solutions.
By default, processes run on the Linux command line are terminated as soon as you log out of your session. However, if you want to start a long-running process and ensure that it keeps running after you log off, there are a couple of ways to make this happen. The first is to use the nohup command.

Using nohup
The nohup (no hangup) command will override the normal hangups (SIGHUP signals) that terminate processes when you log out. For example, if you wanted to run a process with a long-running loop and leave it to complete on its own, you could use a command like this one:

$ nohup long-loop &
[1] 6828
$ nohup: ignoring input and appending output to 'nohup.out'
Note that SIGHUP is the signal sent to a process when its controlling terminal is closed.
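Once the job is running, its output accumulates in nohup.out in the current directory, as the message above indicates. A minimal sketch of checking on it (long-loop is the example script from above; the explicit log redirection is an assumption, not from the article):

$ tail -f nohup.out                     # watch the command's output as it runs
$ nohup long-loop > loop.log 2>&1 &     # or redirect output to a log file of your choosing

Redirecting stdout and stderr yourself keeps each job's output in its own file rather than appending everything to nohup.out.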
Juniper Networks today said it is laying off 440 workers as part of a $59 million restructuring plan. The restructuring strategy is the result of a review of the company’s business objectives and is intended to refocus resources and investments on long-term growth opportunities, the networking vendor wrote in an SEC 8-K filing. “The company believes the plan will further allow it to continue to prudently manage operating expenses in order to deliver improved operating margin,” Juniper wrote. “Total costs currently estimated to be incurred in connection with the Plan are approximately $59 million, of which approximately $48 million are expected to result in cash expenditures.”
Linux systems provide a very easy-to-use command for breaking files into pieces. This is something you might need to do prior to uploading your files to a storage site that limits file sizes or emailing them as attachments. To split a file into pieces, you simply use the split command:

$ split bigfile
By default, the split command uses a very simple naming scheme. The file chunks will be named xaa, xab, xac and so on, and, if you break up a sufficiently large file, you might even get chunks named xza and xzz. Unless you ask, the command runs without giving you any feedback. You can, however, use the --verbose option if you would like to see the file chunks as they are being created.
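Here’s a minimal sketch of splitting a file into fixed-size pieces and putting it back together (the 100M chunk size and file names are illustrative assumptions, not from the article):

$ split --verbose -b 100M bigfile
creating file 'xaa'
creating file 'xab'
$ cat x* > bigfile.restored      # reassemble; the wildcard expands in alphabetical order
$ cksum bigfile bigfile.restored # checksums should match

The -b option caps the size of each chunk; because the generated names sort alphabetically, a simple cat restores the original file.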
The echo command (a bash built-in) is one of the very basic commands on Linux. As with ls and pwd, you can't sit on the command line very long without using it. At the same time, echo has quite a few uses that many of us never take advantage of. So, this post looks into the many ways you can use this command.

What is the echo command on Linux?
Basically, echo is a command that will display any text you ask it to display. However, when you type “echo hello”, the echo command isn't just spitting out those five letters; it's actually sending out six characters, the last one being a linefeed. Let's look at a couple of commands that make this obvious.
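As a sketch of one way to see the extra character, you can count the characters with wc (these may not be the exact commands the full article uses):

$ echo hello | wc -c
6
$ echo -n hello | wc -c    # -n suppresses the trailing linefeed
5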
Intel announced plans to spin off its programmable solutions group (PSG) as a standalone business beginning January 1, with an IPO to follow in about two to three years. The group will operate as a separate unit in the company’s financials. PSG is the group that handles field-programmable gate array (FPGA) processors made by Altera, which Intel acquired for $16.7 billion in 2015. Sandra Rivera, who currently runs Intel’s data center and AI (DCAI) group, will lead PSG as CEO. The company also plans to search for a new chief financial officer for the group.
In this Linux tip, we’re going to look at a way that you can view every Nth line in a text file – whether every other line, every third line, every eleventh line or whatever you want to see.
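The clip itself isn’t included here, so as a sketch, here are two common ways to do this, using every third line as an example (the tip may demonstrate a different method):

$ awk 'NR % 3 == 0' myfile    # print lines 3, 6, 9, ...
$ sed -n '3~3p' myfile        # GNU sed equivalent: start at line 3, step by 3

In the awk version, NR is the current line number; changing the 3 selects a different stride.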
Just weeks after Cisco killed its HyperFlex platform and turned that business over to Nutanix, the vendors rolled out their first integrated hyperconverged infrastructure (HCI) package aimed at easing hybrid- and multi-cloud operations. HCI platforms combine storage, compute, networking and virtualization resources in a single system. The newly available Cisco Compute Hyperconverged with Nutanix combines Cisco’s SaaS-managed compute and networking gear with Nutanix’s Cloud Platform, which includes Nutanix Cloud Infrastructure, Nutanix Cloud Manager, Nutanix Unified Storage, and Nutanix Desktop Services. The system can be centrally managed via Cisco’s cloud-based Intersight infrastructure operations platform, and it supports the Nutanix Acropolis Hypervisor (AHV) and VMware vSphere hypervisors.
As National Cybersecurity Awareness Month kicks off, it's a good time to reflect on how secure the systems you manage are, whether they’re running Linux, Windows or some other OS. While Linux is considered by many to be more secure due to its open-source nature and clearly defined privileges, it still warrants security reviews, and this month's focus on cybersecurity awareness suggests that an annual review is more than just a good idea. The designation became official in 2004, when President George W. Bush and Congress declared October to be National Cybersecurity Awareness Month. Keep in mind that in 2004, security practice often involved little more than updating antivirus software. Today, cybersecurity practices are much more intense, as the threats have grown far more significant and far more challenging.
In this Linux tip, we’re going to look at the comm command – a Linux command that provides a report on the lines that are common in two text files along with those that only exist in one file or the other.
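As a sketch of what that looks like (the file names are illustrative; note that comm expects its inputs to be sorted):

$ comm fileA fileB        # three columns: only in fileA, only in fileB, in both
$ comm -12 fileA fileB    # suppress columns 1 and 2, showing only the common lines

The -1, -2 and -3 options suppress the corresponding columns, so you can isolate whichever set of lines you care about.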
Schneider Electric is warning that the power and cooling demands of AI are beyond what standard data center designs can handle and says new designs are necessary. That may be expected from a company like Schneider, which makes the power and cooling systems used in data centers. But it doesn’t mean Schneider isn't correct. AI is a different kind of workload than standard server-side applications, such as databases, and the old ways just don’t cut it anymore. Schneider's white paper notes that AI needs an ample supply of three things: power, cooling, and bandwidth. GPUs are the most popular AI processors and the most power-intensive. Whereas CPUs from Intel and AMD draw about 300 to 400 watts, Nvidia’s newest GPUs draw 700 watts per processor, and they are often delivered in clusters of eight at a time.
SambaNova Systems, maker of dedicated AI hardware and software systems, has launched a new AI chip, the SN40L, that will be used in the company’s full-stack large language model (LLM) platform, the SambaNova Suite. First introduced in March, the SambaNova Suite uses custom processors and operating systems for AI training and inference. It's designed to be an alternative to power-hungry and expensive GPUs. Upgrading the hardware so soon after launch implies a big jump in performance, and there is one: the SN40L serves a model of up to 5 trillion parameters, with sequence lengths of 256K+ possible on a single system node, according to the vendor.
When it comes to archiving data, there are, generally speaking, three different approaches. Selecting the right system hinges on technical capabilities as well as external factors such as budget constraints. Enterprise storage pros need to balance data-preservation, accessibility, and resource-optimization requirements as they weigh the various archive systems available in the market. Let's take a deeper look at the different types of archive systems.

Traditional batch archive
With a traditional batch archive, data serves its purpose for a certain period before being tucked away in a safe repository, awaiting the possibility of being of some use in the future. The main idea behind this type of archive is to preserve data over an extended timeframe while keeping costs at a minimum and ensuring that retrieval remains a breeze even years down the line. In this kind of archive system, each collection of data selected for archiving is given one or more identities, stored as metadata alongside the archived data. This metadata plays a pivotal role in locating and retrieving the archived information, with details such as project names, the tools used to create the data, the creator’s name, and the creation timeframe all forming part of this digital fingerprint.
Creating energy-efficient and sustainable data centers makes a lot of sense from a business standpoint. Aside from the obvious environmental impact of lower carbon emissions, the potential business benefits include lower operating costs, reduced space requirements and a positive brand image. There’s another good reason for building more sustainable and energy-efficient data centers: regulations and standards are emerging around the world that will require or recommend such actions. IT and networking executives and their teams need to get up to speed on a host of sustainability regulations and standards that will require a response on their part. Energy efficiency and sustainability are no longer just issues for facilities teams. They are a concern for IT teams, who will be asked to provide metrics, making the need for reporting more urgent. They will also need to select more energy-efficient hardware.
Content delivery network (CDN), security and web services company Cloudflare is opening its worldwide network to companies looking to build and deploy AI models, adding new serverless AI, database and observability features and working with several new tech partners to do so. Part one of Cloudflare’s new AI-focused initiative, announced today, is the Workers AI framework, which offers access to GPUs in Cloudflare’s network as a serverless way to run AI models. For users running AI systems that are heavily latency-dependent, the framework should offer the option of running workloads much closer to the network edge, reducing round-trip time. The company said that Workers AI is also designed to separate inference from training data, ensuring that consumer information is not misused.