August 22, 2017
A rack of Tegile's IntelliFlash N Series NVMe arrays scales to 60 PB of effective capacity. Tegile promises 3 million IOPS in a single array with full data services at low latency.
July 13, 2017
Scality's new open source initiative, Zenko, embodies a multi-cloud management vision to help customers match specific workloads to the best cloud service. Zenko is based on Scality S3 Server.
July 12, 2017
As a standalone company, DataGravity is no more. But the startup’s data-aware storage technology will live inside of HyTrust's cloud security products. HyTrust, based in Mountain View, Calif., ...
June 28, 2017
M-Files is adding AI functionality to its platform to better assist in document search, focusing on what a document is, rather than where it is.
Metadata Get Started
Bring yourself up to speed with our introductory content
Metadata management is the oversight of data associated with data assets to ensure that information can be integrated, accessed, shared, linked, analyzed and maintained to best effect across an organization. Continue Reading
Data context is the network of connections among data points. Those connections may be created as metadata or simply identified and correlated. Continue Reading
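The idea of data context as a network of connections can be made concrete with a small sketch. This is a minimal illustration, not any vendor's implementation; the record fields and sample values are hypothetical:

```python
# Hypothetical records carrying metadata attributes ("sensor", "site")
# that connect otherwise unrelated data points into a context network.
records = [
    {"id": 1, "value": 72, "sensor": "S-9", "site": "plant-a"},
    {"id": 2, "value": 68, "sensor": "S-9", "site": "plant-a"},
    {"id": 3, "value": 70, "sensor": "S-4", "site": "plant-b"},
]

def related(records, key):
    """Group record ids by a shared metadata attribute, surfacing context."""
    groups = {}
    for r in records:
        groups.setdefault(r[key], []).append(r["id"])
    return groups

print(related(records, "sensor"))  # {'S-9': [1, 2], 'S-4': [3]}
```

Grouping by any metadata attribute (here, the sensor) exposes the correlations the definition describes.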
Adoption of cloud computing has no doubt been slowed by worries about the security of those out-of-sight servers and resources. While reasonable, those worries have given way over time to acceptance and even optimism.
In this month's Modern Infrastructure cover story, TechTarget’s Trevor Jones writes about why security issues in cloud computing are not the impediment to adoption that they once were. In fact, some organizations are coming to the conclusion that their workloads run more securely in a public cloud than in an on-premises environment. Cloud service providers possess security expertise and experience at levels that aren’t as readily available on a typical IT staff, and certifications give confidence that providers can actually do all they claim to do.
The public cloud is not a risky place to do business, though its safeguards are different from those you've put in place to protect your own data center. These differences are most clearly seen when looking at the shared-responsibility model, which addresses many of the security issues in cloud computing. Users and providers must each do their part. Otherwise, the risks will become apparent, and cloud computing won't be what you need it to be.
This issue also looks at how a new wave of products in development brings memory and storage technologies closer together. Nonvolatile dual in-line memory modules (NVDIMMs), for example, combine the speed of memory with the persistent qualities of storage in some interesting ways. Also included is an article on how some IT shops that have adopted flash storage are simultaneously impressed and disappointed with the results. Costly flash products invariably improve performance, but they won't solve every problem or clear every bottleneck. Continue Reading
Evaluate Metadata Vendors & Products
Weigh the pros and cons of technologies, products and projects you are considering.
Big data vendors routinely push the notion of ingesting all of your data into a data lake. But in many cases, doing so is an unnecessary step that could cause data ingestion problems. Continue Reading
The Salesforce developer certification measures both click skills and code skills. Here is a guide to the code skills necessary to pass the Force.com Developer certification. Continue Reading
The deduplication process reduces the amount of data in a storage system, but dedupe in the cloud may be more valuable to the cloud provider than the customer. Continue Reading
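The reduction the deduplication process achieves comes from storing each unique chunk of data only once and keeping references for the repeats. A minimal sketch of content-hash dedupe (the function name and sample blocks are hypothetical):

```python
import hashlib

def dedupe(chunks):
    """Store only unique chunks, keyed by SHA-256 content hash."""
    store = {}     # hash -> chunk data, stored exactly once
    manifest = []  # ordered hashes needed to rebuild the original stream
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        manifest.append(digest)
    return store, manifest

data = [b"block-A", b"block-B", b"block-A", b"block-A"]
store, manifest = dedupe(data)
ratio = sum(len(c) for c in data) / sum(len(c) for c in store.values())
print(len(store), len(manifest), ratio)  # 2 unique chunks, 4 references, 2.0x reduction
```

Who captures that 2x saving (the customer's bill or the provider's back end) is exactly the question the article raises.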
Learn to apply best practices and optimize your operations.
In some regards, the term big data management can be viewed as an oxymoron. In fact, oxymorons abound in this industry and society -- virtual reality, artificial intelligence, science fiction and awfully good, the latter of which can apply to the challenges encountered in managing the onslaught of big data from multiple sources. Countless tools, techniques and practices are available in the big data ecosystem to properly gather, mine, prep, store and analyze data -- and to help smooth operations, build marketing campaigns, improve customer service and develop the next product disruptor. As simplistic as that may sound, it's up to data managers to sort it all out as their data lakes swell beyond capacity.
"The data lake isn't where data goes to die," Gartner analyst Merv Adrian said at the 2017 Pacific Northwest BI Summit, "it's where data goes to live."
October's Business Information opens with our editor's note and advice for data managers to move beyond traditional data control to the critical task of improving data quality and delivery -- taking all that raw data and making it useful. Whether for internal or external business use, the demands for instantaneous data access continue to accelerate, spurred on by mobile apps, artificial intelligence (AI), machine learning and internet of things (IoT).
In that vein, our cover story examines companies that use their big data ecosystem to divert data lakes toward developing new strategies, products and revenue streams -- in the process, smashing their old business patterns. In another feature, IoT and machine learning technologies help take the guesswork out of estimated times of arrival for transport companies whose businesses depend on shipping and receiving goods.
Also in this issue, a business intelligence project combined three data warehouses into one, cutting warehouse size by 80% and data load time from several weeks to just days, though it caused IT staffing problems along the way. In other features, learn how metadata programs can ease mega management woes; semantic technology could be a blessing or curse to AI; companies are gearing up for greater big data management deployments; and all data must be treated equally in the search to find value. Continue Reading
Big data often comes with big data management problems. Clean, well-defined metadata can make the difference in analyzing big data and delivering actionable business intelligence. Continue Reading
Content management systems can help an organization support the entire content creation process, from production to delivery -- but there's much to consider when evaluating web content management (WCM) systems. Continue Reading
Problem Solve Metadata Issues
We’ve gathered up expert advice and tips from professionals like you so that the answers you need are always available.
Collecting and analyzing NetFlow data can help organizations detect security incidents and figure out their cause. Expert Frank Siemons explains how NetFlow works. Continue Reading
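One way NetFlow analysis surfaces incidents is by aggregating flow records and flagging anomalous patterns. A hedged sketch, using simplified flow tuples (real NetFlow records carry many more fields) and a rough port-scan heuristic of our own choosing:

```python
from collections import defaultdict

# Simplified NetFlow-style records: (src_ip, dst_ip, dst_port, protocol).
# One source hammering many distinct ports on a host suggests a port scan.
flows = [
    ("10.0.0.5", "192.168.1.10", p, "tcp") for p in range(20, 45)
] + [("10.0.0.7", "192.168.1.10", 443, "tcp")]

def scan_suspects(flows, threshold=20):
    """Return (src, dst) pairs whose distinct-port count meets the threshold."""
    ports = defaultdict(set)  # (src, dst) -> distinct destination ports seen
    for src, dst, dport, proto in flows:
        ports[(src, dst)].add(dport)
    return [pair for pair, seen in ports.items() if len(seen) >= threshold]

print(scan_suspects(flows))  # [('10.0.0.5', '192.168.1.10')]
```

The threshold and tuple layout here are illustrative assumptions; production tools apply far richer detection logic to the same kind of aggregated flow data.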
Content without context isn't going to cut it anymore. Here's how snackable content can help shape a user's experience with your information or brand. Continue Reading
Data-aware storage yields new insights and management capabilities. But challenges include meeting the IT organization's needs, tying in with existing systems and weighing the cloud alternative. Continue Reading