Research firm IDC announced Friday that global enterprise storage systems factory revenue increased 7.2 percent year over year to nearly US$10.6 billion during the fourth quarter of last year, while total capacity shipments were up 43.7 percent year over year to 31.8 exabytes during the quarter.

Full-year spending rose 3.6 percent to $36.2 billion, while full-year capacity shipments were up 43 percent to 99.2 exabytes.

EMC completed the fourth quarter in the top position within the total worldwide enterprise storage systems market, accounting for 22.2 percent of all spending. The second position was held by HP, which captured 13.8 percent of spending during the quarter. Dell and IBM ended the quarter in a statistical tie, with each accounting for 9.0 percent of global spending.

As a single group, storage systems sales by original design manufacturers (ODMs) selling directly to hyperscale datacenter customers accounted for 12.8 percent of global spending during the quarter.

EMC was also the biggest external storage systems supplier during the quarter. The company accounted for 32.9 percent of sales, the same share as the previous year. IBM and NetApp generated 11.7 percent and 10.7 percent of total sales during the quarter, respectively. HP generated 9.6 percent of the revenue during the quarter.

The total open networked disk storage market (NAS combined with non-mainframe SAN) increased 6.1 percent year over year to $6.3 billion in revenue. EMC maintained its leadership in the total open networked storage market with 35.7 percent revenue share. NetApp and IBM generated 12.1 percent and 11.1 percent of revenue, respectively.

“Fourth quarter spending on enterprise storage systems was up strongly in most major geographic markets, driven by traditional year-end seasonality, demand for midrange systems that incorporate flash capacity, and continued growth of systems designed for hyperscale datacenters,” said Eric Sheppard, IDC’s research director for storage.

Microsoft announced Thursday that the software giant is providing new Azure services and updates across big data and media services aimed at helping users in their business transformation journey.

Developers want to build cloud-based applications that support multiple platforms and different concurrent versions across multiple applications for user-generated content, Internet of Things, and gaming scenarios. They want these applications to deliver high-scale, reliable performance. NoSQL has emerged as the leading category of database technology to address these needs.

Generally available next month, Azure DocumentDB is a fully managed, highly scalable NoSQL document database service that features rich query and transactional processing over a schema-free JavaScript Object Notation (JSON) data model, enabling rapid development and high performance for applications.

At general availability, DocumentDB will be available in three standard performance levels: S1, S2, and S3. Collections of data within a DocumentDB database can be assigned to different performance levels allowing customers to purchase only the performance they need. Several enhancements have been made in preparation for general availability, including Hadoop integration, a Java SDK, hourly billing, support for larger documents, SQL parameterization, additional regions, and larger account sizes.

DocumentDB allows developers to build new applications that store, query and process data without rigid constraints on schema or the need to manage infrastructure, delivering a database service that lets users focus more on applications and less on infrastructure. As a managed service, DocumentDB removes the heavy lifting of provisioning, managing and scaling virtual machines: users can create a database account in minutes, provision the resources they need, and scale the database as the application grows.

DocumentDB can be used to store heterogeneous JSON documents and query these documents through a familiar SQL syntax with language integrated JavaScript. DocumentDB’s query language supports rich relational and hierarchical queries through a familiar SQL dialect. It is rooted in JavaScript’s type system, expression evaluation and function invocation model.
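For illustration, that query model can be exercised from an ordinary client application. Below is a minimal sketch using the pydocumentdb Python client of that era; the account endpoint, key, collection link and document shape are hypothetical placeholders, not values from the announcement.

```python
# A minimal sketch of a parameterized SQL query over schema-free JSON,
# assuming the early pydocumentdb client; all names here are placeholders.
import pydocumentdb.document_client as document_client

HOST = 'https://example.documents.azure.com:443/'   # hypothetical account
MASTER_KEY = '<your-master-key>'

client = document_client.DocumentClient(HOST, {'masterKey': MASTER_KEY})

# SQL over JSON: filter on a nested property without declaring any schema.
query = {
    'query': 'SELECT c.id, c.address.city FROM c WHERE c.address.city = @city',
    'parameters': [{'name': '@city', 'value': 'Seattle'}]
}

collection_link = 'dbs/mydb/colls/mycoll'            # hypothetical collection
for doc in client.QueryDocuments(collection_link, query):
    print(doc)
```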

The NoSQL database service has been developed to enable applications to store and query documents concurrently, with consistent results for users. DocumentDB is write-optimized and schema-agnostic to serve consistent queries in the face of a sustained volume of document writes. By default, the database engine automatically indexes all documents and properties without requiring schema or secondary indexes.

It also offloads complex transactional processing to the database tier through DocumentDB’s multi-document, transactional execution of stored procedures and triggers. Stored procedures and triggers allow users to process JSON documents through small JavaScript programs executed within the database. DocumentDB’s deep integration of JavaScript eliminates the impedance mismatch between programming languages/type-systems and the database schema.
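Continuing the sketch above, here is a hedged example of registering and executing such a stored procedure; the pydocumentdb method names are assumptions based on that client's early API, and the JavaScript body is purely illustrative.

```python
# A hedged sketch of DocumentDB's server-side JavaScript, reusing the
# `client` and `collection_link` from the earlier example. The JavaScript
# body executes transactionally inside the database engine.
sproc = {
    'id': 'createTwoDocs',
    'body': """
        function createTwoDocs(docA, docB) {
            var coll = getContext().getCollection();
            coll.createDocument(coll.getSelfLink(), docA, function (err) {
                if (err) throw err;              // throwing aborts the transaction
                coll.createDocument(coll.getSelfLink(), docB, function (err2) {
                    if (err2) throw err2;        // docA is rolled back as well
                });
            });
        }
    """
}

created = client.CreateStoredProcedure(collection_link, sproc)
# Both documents are created atomically, or neither is.
client.ExecuteStoredProcedure(created['_self'], [{'id': 'a1'}, {'id': 'b1'}])
```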

The service also lets developers tune data consistency to achieve the best application performance. Well defined, predictable and easy to use, these tunable data consistency options allow users to make deterministic tradeoffs between read consistency and performance. DocumentDB’s session consistency offers read-your-writes guarantees with high-performance reads and writes.
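As a sketch of what choosing a consistency level looks like in practice, the client below is constructed with session consistency; this assumes the Python client exposes the same consistency levels as the service's x-ms-consistency-level REST header, and the exact constructor details may differ.

```python
# A hedged sketch of selecting session consistency at client creation time;
# the account endpoint and key are hypothetical placeholders.
import pydocumentdb.documents as documents
import pydocumentdb.document_client as document_client

client = document_client.DocumentClient(
    'https://example.documents.azure.com:443/',   # hypothetical account
    {'masterKey': '<your-master-key>'},
    None,                                         # default connection policy
    documents.ConsistencyLevel.Session)           # read-your-writes guarantee
```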

Microsoft also announced general availability of Azure Search, a search-as-a-service solution that helps developers build sophisticated search experiences into web and mobile applications. Azure Search enables developers to reduce the friction and complexity of implementing full-text search, and to differentiate their applications by leveraging powerful features, such as enhanced multi-language support, that are not available in other search packages.

This release offers support for more than 50 languages—utilizing the natural language processing technology used by Microsoft Office and Bing. Updated features include the ability to more easily load data from Azure DocumentDB, Azure SQL Database, and SQL Server on Azure VMs to Azure Search using new indexers. Plus, a .NET software development kit (SDK) is now available to make working with Azure Search a more familiar experience.

The indexer infrastructure behind these new indexers provides a no-code solution: users point Azure Search at a data store, and it ingests the data, as well as subsequent changes, on a scheduled basis.
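A minimal sketch of that no-code indexer flow against the Azure Search REST API appears below; the service name, admin key, api-version and field names are assumptions for illustration rather than details from Microsoft's announcement.

```python
# A hedged sketch of the indexer flow: define a data source, then an
# indexer that pulls from it on a schedule. All names are placeholders.
import json
import requests

SERVICE = 'https://example-search.search.windows.net'  # hypothetical service
HEADERS = {'api-key': '<admin-key>', 'Content-Type': 'application/json'}
API = {'api-version': '2015-02-28'}                     # assumed GA version

# Point Azure Search at an Azure SQL Database table...
datasource = {
    'name': 'products-ds',
    'type': 'azuresql',
    'credentials': {'connectionString': '<sql-connection-string>'},
    'container': {'name': 'Products'}
}
requests.post(SERVICE + '/datasources', headers=HEADERS, params=API,
              data=json.dumps(datasource))

# ...then schedule an indexer to ingest rows, and row changes, hourly.
indexer = {
    'name': 'products-indexer',
    'dataSourceName': 'products-ds',
    'targetIndexName': 'products',
    'schedule': {'interval': 'PT1H'}
}
requests.post(SERVICE + '/indexers', headers=HEADERS, params=API,
              data=json.dumps(indexer))
```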

Microsoft also released into preview Azure Media Services Premium Encoder to provide advanced encoding capabilities for premium on-demand media workflows. Premium Encoder offers superior quality and flexibility suited for broadcast industry/professional media transcodes. This includes automated decision-making logic that can adapt to a variety of input file formats, support for additional input and output codecs and file formats such as 4K Ultra HD in AVC and closed captions, and a workflow design tool.

In addition, Azure Media Services is now fully integrated with Azure Content Delivery Network (CDN). CDN integration with Azure Media Services provides point-and-click provisioning of edge services, speeding time to market and giving content global reach.

Microsoft also made generally available A10 and A11 instances that feature faster processors, more virtual cores for greater compute power and larger amounts of memory. A10 instances have 8 virtual cores and 56 GB of RAM, while A11 instances have 16 virtual cores and 112 GB of RAM. With these instances, customers can run compute-intensive applications such as video encoding, risk modeling, simulation, and more. These new instances offer more options for customers who do not require the InfiniBand networking capability that comes with A8 and A9 instances.

Azure Active Directory now allows the assignment of shared application accounts to groups. This is particularly useful for managing access to company-owned social application accounts, such as Twitter, Facebook or LinkedIn, that many users want to access. Previously, the real password of the application was known only by the administrator, while everyone else had access by using their work accounts. A public preview improves the security of this process by letting customers automatically change the application password to randomly generated, strong, complex passwords at custom intervals.
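As a toy illustration of the rotation idea (not Azure AD's implementation), generating a strong random password for each interval might look like this:

```python
# A toy sketch: produce a strong random password on each rotation interval.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def new_shared_account_password(length=32):
    # secrets uses a cryptographically secure random source, unlike random.
    return ''.join(secrets.choice(ALPHABET) for _ in range(length))

print(new_shared_account_password())
```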

Proofpoint launched a real-time social media security operations console that aggregates security intelligence across social accounts and social networks.

Backed by a social media security database and five patent-pending social media security technologies, Nexgate Social Threat Center detects the myriad of social threats that brands face and measures the effectiveness of existing security policies.

A division of Proofpoint, Nexgate offers social media security and compliance, while its researchers and automated scanners constantly analyze millions of social media posts to develop a database of malicious content, attack patterns, applications, perpetrators, and compliance violations. This intelligence is then continuously fed into Nexgate policy engines to ensure that customers receive up-to-the-minute protections against the widest range of social threats.

A sampling of data covered as of last July includes over 300,000,000 scanned posts, over 25,000,000 scanned users, more than 150,000 branded accounts scanned, over 10,000 known social media threat perpetrators, over 6,000,000 malicious, inappropriate, and compliance content patterns, 110 content categories such as spam, malware, hate and adult language, 35 compliance content policy categories such as FFIEC, credit card numbers, earnings and financial updates, along with seven pre-built, industry specific content policy templates.

Currently available, the Social Threat Center prioritizes risk and allows teams to take immediate protection action where it matters most. Once threats are identified, the Social Threat Center divides social media threats by specific risk type and category including suspicious social accounts, bad actors, malicious content, hacking risk, compliance risk, application control and abusive content.

In addition, the Social Threat Center tracks and measures the effectiveness and ROI of the social media security policies used by an organization. According to Proofpoint, no other solution matches this breadth of visibility, protection and security effectiveness measurement.

Proofpoint Nexgate’s in-depth Social Threat Intelligence Database powers the visibility, protection and effectiveness reporting in the new Social Threat Center. Proofpoint researchers and automated scanners constantly analyze millions of social media posts and web content to feed the Nexgate Social Threat Intelligence Database.

This repository includes malicious content, attack patterns, applications, perpetrators and compliance violations. To ensure customers receive up-to-the-minute protection against the widest range of social threats, this intelligence is continuously fed into the Proofpoint Nexgate policy engines.

Earlier this week, Proofpoint entered into a definitive agreement to acquire Emerging Threats, a vendor of advanced threat intelligence, for approximately $40 million in cash and stock.

Proofpoint will integrate Emerging Threats’ advanced threat intelligence with its existing Targeted Attack Protection and Threat Response security solutions, a significant step forward in advanced threat detection and response across the complete attack chain.

Emerging Threats uses an automated collection and analysis system, along with a team of expert threat researchers, to produce actionable threat intelligence for detecting, blocking and remediating advanced cyberattacks. The combined technology will provide customers with deeper insight into cyberthreats, enabling them to react faster to inbound cyberattacks, and to identify, block, and disable previously undetected malware already embedded in their organizations.

The combination of Emerging Threats’ threat intelligence with Proofpoint’s existing big data platform and threat analytics systems will provide compounded benefits in both detection and response capabilities. The integrated offerings extend Proofpoint’s capabilities in detecting advanced malware propagated through email and social media messaging systems.

The result is a unique, end-to-end view into the entire kill chain of cyberattacks and cyberattackers. This unmatched insight into the attack chain provides improved threat detection and faster, automated incident response and remediation for organizations worldwide.

IBM announced Wednesday that it is expanding its Watson ecosystem by acquiring AlchemyAPI, a provider of scalable cognitive computing application programming interface (API) services and deep learning technology. The acquisition will accelerate IBM’s development of cognitive computing applications. Financial terms of the deal were not disclosed.

This acquisition builds on the recent announcement of the IBM Watson Personality Insights API and the launch of five new beta Watson services, taking the total number of cognitive API services launched in the past six months alone to thirteen. Since last month, users can access free beta services on Bluemix for speech to text, text to speech, visual recognition, concept insights and tradeoff analytics.

IBM Watson enhances, scales and accelerates human expertise, and represents a new era of computing in which apps and systems interact with human users more naturally, augment knowledge with Big Data insights, and learn and improve over time. Fueled by innovation and a mission to transform industries and professions, IBM offers a host of cloud-based products and services to clients in industries such as banking, healthcare, insurance, retail and education.

Last January, IBM launched its Watson unit, a business dedicated to developing and commercializing cloud-delivered cognitive computing technologies. The move signified a strategic shift by IBM to deliver a new class of software, services and apps that think, improve by learning, and discover insights from massive amounts of Big Data.

IBM is investing $1 billion into the Watson unit, focusing on development and research, and bringing cloud-delivered cognitive applications and services to market. This includes $100 million available for venture investments that support IBM’s ecosystem of start-ups and businesses building cognitive apps powered by Watson.

IBM will integrate AlchemyAPI’s deep learning technology into the core Watson platform, augmenting Watson’s ability to identify hierarchies and understand relationships within large volume data sets. The technology is expected to enhance Watson’s ability to ingest, train and learn the “long-tail” of various data domains, including general business and target industries, as well as address the need to manage constantly evolving ontologies.

In addition, the acquisition will enhance the number and types of scalable cognitive computing APIs available to IBM clients, developers, partners and other members of the Watson ecosystem. This includes language analysis APIs to address new types of text and visual recognition, and the ability to automatically detect, label and extract important details from image data.

AlchemyAPI is a web service that analyzes unstructured content like news articles, blog posts and images, exposing the semantic richness in data. The company’s natural language processing and computer vision technology analyzes text, image, or web-based content, identifying images, named entities such as people, locations and companies, facts and relations, topic keywords, text sentiment, news and blog article authors, taxonomy classifications, and scraping structured data.

AlchemyAPI can be used directly through its Internet-accessible REST interface, or through any of its downloadable software development kits (SDKs). SDKs are provided in over a half-dozen programming languages, enabling developers to mix AlchemyAPI content analysis capabilities into web, mobile or standalone applications.
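For illustration, here is a hedged sketch of calling that REST interface directly from Python; the endpoint and parameter names reflect AlchemyAPI's publicly documented API of the time and should be treated as assumptions.

```python
# A hedged sketch of a named-entity extraction call against AlchemyAPI's
# REST interface; the API key is a hypothetical placeholder.
import requests

API_KEY = '<your-alchemyapi-key>'

resp = requests.get(
    'http://access.alchemyapi.com/calls/text/TextGetRankedNamedEntities',
    params={
        'apikey': API_KEY,
        'text': 'IBM is acquiring Denver-based AlchemyAPI.',
        'outputMode': 'json',   # request JSON instead of the default XML
    })

for entity in resp.json().get('entities', []):
    # e.g. a 'Company' entity for 'IBM' and a 'City' entity for 'Denver'
    print(entity['type'], entity['text'], entity.get('relevance'))
```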

AlchemyAPI supports more than a half-dozen spoken languages, including English, Spanish, German, Russian and Italian. Its multi-lingual support is comprehensive; language coverage extends to the native tongues of more than 1.1 billion individuals.

“IBM continues to invest in Watson’s core technology and cloud development platform, amplifying a robust Watson ecosystem where third party organizations are creating new businesses and solutions powered by Watson,” said Mike Rhodin, senior vice president, IBM Watson. “Our ability to draw upon both internal and external sources of innovation, from IBM Research to acquisitions like AlchemyAPI, remain central to our strategy of bringing Watson to new markets, industries and regions.”

“Today is the start of a new journey for AlchemyAPI, our customers and our user community, as we join IBM, the leader in cognitive computing,” said Elliot Turner, founder and CEO, AlchemyAPI. “We founded AlchemyAPI with the mission of democratizing deep learning artificial intelligence for real-time analysis of unstructured data and giving the world’s developers access to these capabilities to innovate. As part of IBM’s Watson unit, we have an infinite opportunity to further that goal.”

Watson is the initial commercially available cognitive computing capability representing a new era in computing. The system, delivered through the cloud, analyzes high volumes of data, understands complex questions posed in natural language, and proposes evidence-based answers. Watson continuously learns, gaining in value and knowledge over time, from previous interactions.

IBM is delivering new Watson services and APIs through the Watson Zone on Bluemix, the company’s digital innovation platform that enables developers to rapidly build, deploy and manage apps across any combination of public, private and hybrid cloud. Thousands of developers, entrepreneurs, data hobbyists, students and others have already built more than 7,000 apps powered by Watson to date.

AlchemyAPI’s capabilities, including new language analysis and visual recognition services, will be delivered through Bluemix, and enable developers to quickly build a range of business applications. New and existing users can sign up for a free trial and access the AlchemyAPI services.

by Dirk Paessler

IT departments and technology vendors have two very distinct viewpoints. IT is laser-focused on “right now,” working diligently to assure business processes and providing services to end users. Technology vendors are in the futures business, constantly selling IT (or, as is often the case, investors) on how to solve “next generation” problems. The disconnect between the two is that in many cases, rank-and-file IT employees do not have next generation problems – they have “right now” problems.

The latest and greatest from leading technology companies promise IT a brave new world, but that doesn’t help an IT department fix a server hangup, improve VoIP call quality or right-size a virtual machine. Technology thought leaders have done a fantastic job winning over venture capitalists, public markets and media, but the IT departments, especially at small businesses, are left behind to do much of the heavy lifting themselves.

IT pros need technology companies that are in their corner, ones that understand the issues they face and have ready-made solutions that are available, not “in the pipeline.” Anyone in the trenches knows that it’s the everyday problems that keep IT up at night. As with many problems, it is important to be proactive in solving them. IT infrastructure is increasingly complex and needs to be operational 24/7. The only way to truly ensure service availability is to deliver a healthy dose of preventative medicine – the kind that comes from infrastructure monitoring. Outages and crashes are incredibly costly to repair, but preventing them is often a matter of identifying a minor issue before it becomes a major problem.

While many technology companies are focused on tomorrow, infrastructure monitoring firms are focused on today, helping to solve the problems that IT departments struggle with on a daily basis.

Identifying hardware issues before they happen

IT infrastructure needs constant review, but short-staffed departments do not have the resources to manually check everything that needs attention on a consistent basis. And yet, there are a variety of ways to extract data from hardware systems to gain a deeper understanding of everything from why applications are running slow to the temperature of the servers. Through infrastructure monitoring, IT can get a grip on current trends and understand how resources are used, which makes it easier to prevent unexpected events like server outages. Improving access to the most important details of IT infrastructure is critical.

Less hardware, more problems?

Virtual machines are out of sight, but are never out of mind. It is difficult enough for IT to right-size and provision VMs when they deploy them, but they then have to maintain network speed and performance. Virtualization provides tremendous flexibility, but with that comes the real need to keep track of VMs, how they are used, CPU and RAM, and more. Technology has given IT incredible power through virtualization, but more needs to be done in terms of helping IT tame that power.

Failing Windows services and server hang-ups

For better or for worse, email drives businesses and most businesses run Outlook. Unfortunately, Outlook issues are one of the most common problems IT has to solve, and they are often time sensitive – lost productivity means lost business. In many cases, the easy fix to Outlook issues is to simply reboot the server it’s running on, which may sound simple, but if email hosted in New York goes down for a company that does business, or has offices, around the world, it’s a major problem. Giving IT the power to set customized alerts and receive them through email, SMS or push notifications gives them immediate notice when something is wrong and, more importantly, time to fix a minor issue before it becomes a major problem.
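The pattern the column describes is simple to sketch (this is an illustration, not PRTG itself): poll a health metric and alert the moment it crosses a threshold. The host names, addresses and threshold below are hypothetical.

```python
# An illustrative monitoring check: sample CPU load and email the on-call
# team when it crosses a threshold, before users notice a slowdown.
import smtplib
from email.message import EmailMessage

import psutil  # cross-platform system metrics library

CPU_ALERT_PERCENT = 90.0

def check_and_alert():
    cpu = psutil.cpu_percent(interval=5)       # sample CPU over 5 seconds
    if cpu >= CPU_ALERT_PERCENT:
        msg = EmailMessage()
        msg['Subject'] = 'ALERT: CPU at %.0f%% on mail server' % cpu
        msg['From'] = 'monitor@example.com'    # hypothetical addresses
        msg['To'] = 'it-oncall@example.com'
        msg.set_content('Investigate before users notice Outlook slowdowns.')
        with smtplib.SMTP('smtp.example.com') as smtp:
            smtp.send_message(msg)

if __name__ == '__main__':
    check_and_alert()
```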

Spotting attacks and keeping data safe

It has become clear that antivirus alone is not enough to defend against the threats businesses face, but at many small businesses, antivirus is all they have. Yet, there is an enormous amount of network data available that can tip off IT to a cyberattack. Visibility into networks can reveal the types of behavior associated with malware, whether it’s uncommon traffic, CPU spikes, or brute-force login attempts. IT needs all products, not just security products, to provide the kind of visibility that will let them address malicious behavior as well as run-of-the-mill performance issues.

Being a friend to IT – even the little guy

The explosive growth of IT has not, and likely will never be, matched by an explosive growth in IT staffing. Especially at small businesses, IT professionals have to learn to do more with less. While technology vendors can’t change that, they can do more to support them. By focusing on everyday problems, infrastructure and network monitoring companies help IT proactively solve problems and keep an eye on the entire infrastructure, even when they can’t. It’s easy for technology companies to cast an eye to the future, but when it comes to what keeps IT administrators up at night, solving “right now” problems is just fine.

Dirk Paessler is the CEO and founder of Paessler AG, makers of the PRTG network monitoring software.

Emulex released Wednesday a new line in its network monitoring and visibility portfolio, architected to support the demands and complexity of 10Gb, 40Gb and 100Gb Ethernet (100GbE)-based networks.

The new EndaceProbe Intelligent Network Recorders (INRs) and EndaceVision 6.0 Network Visibility Software allow users to manage and make sense of the massive network traffic volumes created by continual business demands. They also decrease the costs and effort of managing physical, virtual and software-defined networks by enabling network and security professionals to rapidly capture, analyze and diagnose potential and actual network and security issues in real time.

When combined with EndaceVision 6.0, EndaceProbe INR network visibility appliances allow network operations teams to identify issues, and optionally pass the relevant data to third party tools, thereby enhancing root cause isolation and mean time-to-resolution (MTTR).

The ability to view network incidents with microsecond-level accuracy is unprecedented in the Application Aware-Network Performance Management (AA-NPM) and security space.

EndaceProbe INRs come with a hypervisor that permits analysis, management and security applications to be hosted in a VM with direct access to packet data at the point of capture. Since multi-gigabit capture files can be large and cumbersome to send to analysis applications, the ability to load third-party software directly onto the EndaceProbe INRs means the analysis software can be added or moved to the data instead of moving the data to the software.

EndaceProbe 4104 and 8004 INRs provide 10GbE write to disk rates, minimizing total cost of ownership while providing the industry’s only 100 percent accurate packet capture on multi-10GbE networks. EndaceProbe INRs also include increased compute power and memory that reduces the hardware footprint needed to support the analysis of multi-10GbE networks.

The EndaceProbe INRs include three new options: the 4104, 4004, and 8004. The EndaceProbe 4104 INR – with highly reliable solid state disk (SSD) primary storage – is ideal for on-demand recording across multiple 10GbE networks or wherever physical access is a challenge. The EndaceProbe 4004 INR is a cost-effective 1RU appliance for lower packet rate multi-1GbE or 10GbE network links, ideal for remote locations.

The EndaceProbe 8004 INR is a 2RU appliance providing ultra-fast, highly reliable disk storage for packet capture on 10GbE or higher networks. It is ideally suited for the core, distribution, and access layers within data centers and large campus environments for continuous high-speed recording and fast retrieval during traffic analysis.

A purpose-built, web-based network history visualization application for high speed networks, EndaceVision 6.0 comes bundled with all EndaceProbe INRs. Using EndaceVision, network operations, security, or incident response professionals can navigate quickly through network history data – both packets and metadata – to identify the cause of network issues and take appropriate action – without guesswork.

EndaceVision includes Endace MicroVision to give network managers microsecond-level views into the captured packet data, allowing users to troubleshoot disruptive high speed network issues with more detailed and accurate inspection of shorter time periods. For 10GbE, the time to transmit each packet has decreased by ten times while the amount of traffic has increased by ten times, dramatically increasing the amount of data an IT analyst must cull through when troubleshooting a network or application issue. Endace MicroVision allows analysts to focus on the problem within small timeframes, reducing the distraction of additional data and enhancing MTTR.
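The arithmetic behind that "ten times" figure is straightforward: serializing a full-size 1,500-byte Ethernet frame onto the wire takes about 12 microseconds at 1GbE but only about 1.2 microseconds at 10GbE.

```python
# A back-of-the-envelope check of per-frame transmit time at 1GbE vs 10GbE.
FRAME_BITS = 1500 * 8          # a full-size Ethernet frame, in bits

for name, rate_bps in [('1GbE', 1e9), ('10GbE', 1e10)]:
    print('%s: %.1f microseconds per frame' % (name, FRAME_BITS / rate_bps * 1e6))
# 1GbE: 12.0 microseconds per frame
# 10GbE: 1.2 microseconds per frame
```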

EndaceVision also includes the TCP Flags visualization and client and server breakdowns, leveraging TCP’s intrinsic statefulness and explicit signaling to assist in monitoring and debugging transport services across the network. It also meets the requirements of the utilities industry by including automatic ICS/SCADA protocol detection and visualization, apart from full packet-level decodes in the web-based EndacePackets viewer, enabling analysts to ensure the stability and security of utility network traffic.

Imation Corp. unveiled a new architecture that gives enterprises a holistic approach to actively managing their high-value data files and protecting them from tampering, destruction, loss or leakage through their entire lifecycle. It also empowers end users to manage their own policies for protecting their high-value data, in addition to enabling organizational control and management policies.

The Secure Data Movement Architecture (SDMA) safeguards high-value data from hand to cloud and creation to destruction, and manages it through policy-based control and security, Imation announced Tuesday. It also addresses the need to safeguard data files throughout the data lifecycle, including archive, remote workers, file sharing and collaboration, storage management and cloud and mobile device endpoints.

Leveraging Imation’s expertise in both storage-side and client-side data security solutions, SDMA comprises a suite of security capabilities that provide protection at every stage of the data lifecycle (storage administration, file sharing and collaboration, server archive and cloud archive). As a result, enterprises know that data is protected from loss, destruction and leakage on any device – mobile to cloud – with controlled and audited access to the data.

SDMA protects high-value data throughout the broader information ecosystem by drawing upon core capabilities found within Imation’s Nexsan Assureon Secure Archive, IronKey Secure Storage and IronKey Workspace with Windows To Go solutions that are currently in use. Assureon provides a comprehensive secure archive solution that protects and preserves high-value data.

Imation’s Nexsan Assureon secure storage solution brings down storage costs by offloading from primary storage any data that is infrequently accessed or has aged by policy. Through policy automation, Assureon can also eliminate or greatly reduce the size, cost and complexity of backups for both primary and seldom-used data. Assureon includes multi-tenancy—with secure copy creation, data movement and long-term storage—and chargeback capabilities for public and private cloud deployments.

IronKey Workspace with Windows To Go is a Microsoft-certified solution that comes complete with applications, security controls and access policies stored on a ruggedized, fully manageable USB flash drive. With IronKey, users are offered a fully functional version of Windows 8.1 and can transform any computer into an IT-managed workstation, at a savings of up to 90 percent compared to the cost of a comparably equipped new laptop.

IronKey Secure Storage solutions utilize hardware-encrypted USB flash drives to provide secure portable data storage for mission-critical mobile workforce and the invaluable data they carry.

Data integrity features like file fingerprinting and automated self-healing integrity checks ensure that high-value data is constantly protected. Assureon’s security features comply with corporate and governmental regulatory requirements, making it well suited for medical, financial and government organizations.

IronKey mobile storage solutions enable users to securely access their workspaces and files remotely while adhering to corporate IT standards, and IronKey Workspace with Windows To Go is a PC on a Stick that equips employees and contractors with a portable Windows 8.1 corporate image.

A recent Imation survey of end users revealed 70 percent did not classify high-value data, highlighting the organizational disconnect of protecting critical files. Organizations attempt to define or dictate policies or deploy point solutions in an attempt to regain control over their high-value data; this has only succeeded in frustrating and hampering the end user’s ability to do more with their data.

End users want their data safeguarded, but they also want to be productive when they are away from the office or working with external contractors and consultants. When faced with cumbersome policies or solutions, end users either bypass them or limit their effort to the boundaries placed on them.

In addition, information technology and security professionals are increasingly challenged by the pace in which data files move both internally and externally. It’s hard for them to manage data across all of the physical and other storage devices that data lives on – personal computers, servers, mobile devices and/or the cloud – and to keep that data from being lost or stolen. These challenges only get more difficult as data growth continues to explode and more data files are seen as important or mission-critical business assets.

“Imation’s Secure Data Movement Architecture is an emerging real-world solution to today’s unstructured data management challenge. Data growth continues at exponential rates, and this data is an attractive target for cybercriminals looking to score big. Despite their best efforts organizations remain vulnerable to many potential problems relating to storing and securing their most important data, which is one of the reasons information security is the most important IT priority for 2015,” said Terri McClure, senior analyst, ESG. “Imation is uniquely positioned to lead the way forward for the industry with its pedigree in storage and security, its Nexsan family of data storage solutions and its IronKey mobile security solutions.”

“Security and storage can no longer be siloed. Imation is introducing comprehensive and innovative protection for high-value data files,” said Ian Williams, president of tiered storage and security solutions at Imation. “Organizations can no longer throw additional point solutions and hardware at the problem – we must take a holistic approach to data loss and leakage issues. The Secure Data Movement Architecture solves these problems in a comprehensive way and provides a framework to protect important data files. No matter where a user is, how much data they create or what device or solution they use to create and house their data – SDMA is designed to safeguard their most important information assets with the utmost confidence.”

VMware debuted on Tuesday its vCloud Air for government and public sector enterprises to purchase, both directly and as part of the G-Cloud 6 framework – the latest version of the UK government’s cloud-based services network. The offering will help public sector organizations boost efficiency and bring down costs by using VMware’s hybrid cloud platform to transform their IT infrastructures.

G-Cloud 6 is a set of structures for the public sector to buy cloud-based services through a digital marketplace. Public sector bodies can review the cloud services available and buy them through the Digital Marketplace. VMware has been awarded a place on the Framework agreement for the supply of VMware vCloud Air Dedicated Cloud, VMware vCloud Air Virtual Private Cloud and VMware vCloud Air Disaster Recovery as a Service.

VMware vCloud Air enables public sector organizations to extend their on-premises IT infrastructure seamlessly to the public cloud. VMware vCloud Air can integrate seamlessly with the existing estate, giving the ease and flexibility to extend workloads, even mission critical applications, into the public cloud as and when additional capacity is needed. The resulting hybrid cloud is compatible with existing public sector applications and also makes it easy to build new cloud-native applications, delivering agility and efficiency in a secure, reliable and compliant manner.

IT departments can view, manage and operate this “best of both worlds” cloud environment in a unified way using the VMware vSphere platform they already know and trust. As the data center for the service is based in the UK, all citizen data is stored under UK and EU compliance and data sovereignty standards.

VMware launched last month an enterprise-class hybrid cloud service for U.S. public sector organizations under the government’s Federal Risk and Authorization Management Program (FedRAMP). The now generally available VMware vCloud Government Service, provided by Carpathia, has achieved the FedRAMP Provisional Authority to Operate (ATO).

FedRAMP ATO is mandatory for any cloud service provider serving the Federal government, and VMware’s offering is now generally available to U.S. government and defense organizations.

VMware vCloud Government Service, an infrastructure-as-a-service hybrid cloud, is based on the VMware vSphere platform used by cabinet level agencies, military services, the Department of Defense, and the judicial and legislative branches of government.

“VMware vCloud Air is gaining rapid traction across the globe as IT departments see what can be achieved with true hybrid cloud,” said Andy Tait, VMware’s head of public sector strategy. “Many government services — such as passport processing or tax returns — have significant peaks and troughs throughout the year; yet have stringent data security requirements all year round.”

“vCloud Air enables the public sector to keep business-critical applications in-house and also take advantage of the enormous scalability of public clouds securely, without having to invest in new tools, skills or under-used on-premises capacity,” Tait added.

vCloud Air provides application interoperability with no changes to current applications, making it easy to extend existing applications to the cloud. vCloud Air supports more than 5,000 applications and over 90 operating systems certified to run on vSphere – the largest supported base available in any cloud service.

vCloud Air also offers high availability and consistent performance of packaged applications, whether they run onsite, offsite or across a combination of both. vCloud Air comes with the interoperability needed to ensure application services run reliably across onsite and offsite infrastructure, using the exact same tools and processes.

vCloud Air extends the security and performance of a current onsite VMware environment to cloud-based application services, while letting IT add capacity, allocate resources, and manage the timing of upgrades and patches without having to worry about application support, interoperability and IT training. vCloud Air provides the perfect destination to deploy existing applications in the cloud while leveraging existing IT investments.

Red Hat launched Tuesday its new Cloud Innovation Practice, a global team of experts that will help companies on-ramp to the cloud more quickly, particularly those adopting Agile and DevOps best practices. The Red Hat Cloud Innovation Practice is designed to help users reduce project risks, expedite delivery, and manage the lifecycle of individual solutions.

Formed out of Red Hat’s acquisitions of Ceph storage system provider Inktank and cloud computing services provider eNovance, the Red Hat Cloud Innovation Practice integrates the technology and engineering expertise gained through those deals. Both companies provided the highly specialized skills, solution sets, methodologies and product expertise required to launch a truly forward-looking cloud practice capable of helping companies manage the complexities associated with the cloud.

With an eye on helping customers increase their business agility and ability to quickly adapt to changes in their environment, the Red Hat Cloud Innovation Practice can define and develop standard operational procedures, methodologies and governance for cloud and DevOps strategy development and deployments across Red Hat’s current portfolio of integrated products and services, including Red Hat Enterprise Linux OpenStack Platform, OpenShift by Red Hat, Red Hat CloudForms, and Inktank Ceph Enterprise.

Additionally, the Red Hat Cloud Innovation Practice will aggregate product feedback and provide the technical leadership, evangelism, best practices, research, support and dissemination services needed to create relevant use cases outlining ways to help increase delivery efficiency, speed up deployments, and achieve a quicker return on investment.

Last week, Red Hat announced OpenShift Commons, its open source community initiative to collaborate and deepen engagement with OpenShift, Red Hat’s open source Platform-as-a-Service (PaaS) offering, and the open source technologies that OpenShift is built upon.

OpenShift by Red Hat incorporates numerous open source technologies, including OpenShift Origin, Docker, Kubernetes and Project Atomic. OpenShift Commons brings together these communities and is designed to facilitate sharing of knowledge, feedback and insights into best practices across the OpenShift ecosystem and enable collaboration on the dependencies that can best advance open source PaaS.

OpenShift Commons operates under a shared goal to move conversations beyond code contribution and explore best practices, use cases, and patterns that work in today’s continuous delivery and agile software environments. For companies not yet deploying OpenShift, OpenShift Commons can help connect them to large scale delivery experts in the context of other common open source projects, including Docker, Kubernetes and Project Atomic.

There is no Contributor License Agreement, code contribution requirement or fees to join, just a commitment to collaborate on the new PaaS stack.

Global flash storage vendor SanDisk debuted Sunday its iNAND 7132 embedded storage offering for use in mobile devices, with a storage architecture that allows data transfer speeds of 1Gb per second or higher on demand. The company also introduced its high endurance video monitoring microSDXC memory card, built to withstand up to 10,000 hours of full HD video recording, and its 200GB SanDisk Ultra microSDXC UHS-I card, Premium Edition, which gives mobile users the ability to capture, save and share photos, videos and other files without worrying about storage limitations.

Samples of the iNAND 7132 storage solution are currently available to customers in capacities up to 64GB. The SanDisk microSDHC/microSDXC cards come with a two-year warranty and are priced at an MSRP of US$84.99 and $149.99 for the 32GB and 64GB capacities, respectively. The cards will initially be available in the United States and through select retailers in Europe and South Korea. The 200GB SanDisk Ultra microSDXC UHS-I card, Premium Edition, features a ten-year limited warranty and will be available globally in the second quarter at an MSRP of $399.99.

The iNAND 7132 storage solution features SanDisk’s new iNAND Accelerator Architecture with SmartSLC technology, a storage architecture that responds on-demand to mobile users’ needs and boosts experiences for data-intensive applications. iNAND 7132 storage solution is supported by advanced simulation, trouble-shooting and engineering tools that enable mobile manufacturers to quickly and easily integrate the device into mobile device designs, thereby significantly reducing time from product development to product availability.

Available in capacities up to 64GB, iNAND 7132 storage solution enables original equipment manufacturers (OEMs) to introduce a new generation of high-capacity smartphones, tablets and connected devices that improve user experiences.

iNAND 7132 storage solution is built with SanDisk’s 1Y-nanometer 3-bit-per-cell (X3) NAND flash storage. When combined with the drive’s SmartSLC technology, this storage solution offers near single-level-cell performance when user needs demand it, providing sequential write speeds of up to 1Gb per second and beyond. It also brings performance to more data-intensive business, video, photography and mobile gaming applications, and supports 802.11ac and 802.11ad network standards.

The speed of the iNAND 7132 storage solution enables device makers to push the design boundaries of smartphone photography and video functionality. When used in optimized smartphones, the performance of the iNAND 7132 storage solution supports professional grade digital photography capabilities, including image capture in RAW format, thereby expanding the possibilities of image capture and processing. In addition, 4K Ultra HD video can be captured and played back with ease.

Using its NAND flash and systems technology capabilities, SanDisk developed the microSDXC memory card with a proprietary technology and process that enable high-intensity recording. Rigorous testing proved the new technology enables the company to reach a new milestone — a 64GB microSDXC card that allows consumers to write and rewrite up to 10,000 hours of full HD video recording.

The SanDisk high endurance video monitoring microSDXC/microSDHC cards are targeted at dash cameras and home video monitoring systems, ensuring drivers and homeowners have reliable video evidence at their fingertips. The card is also available in 32GB capacity, which can withstand up to 5,000 hours of full HD recording.

With the new SanDisk microSDHC/microSDXC cards, security-minded users need not worry about their card sustaining long-term, continuous recording scenarios. These cards have also been designed to weather the elements. Whether out in the rain or snow, or sitting on the dash of a hot car, users can expect seamless performance. The cards have been tested and proven to be shockproof and waterproof, so they deliver in even the harshest environments.

Ideal for Android smartphone and tablet users, the 200GB SanDisk Ultra microSDXC card combines high capacity and quick transfer speeds of up to 90MB/s to deliver premium performance. At this transfer speed, consumers can expect to move up to 1,200 photos per minute.
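Taken together, those two figures imply an assumed average photo size of roughly 4.5MB:

```python
# Checking the "1,200 photos per minute" figure against the 90MB/s rating.
transfer_mb_per_min = 90 * 60                 # 90MB/s sustained for one minute
photos_per_min = 1200
print(transfer_mb_per_min / photos_per_min)   # 4.5 MB per photo, implied
```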

Through SanDisk’s updated Memory Zone app, users will have more control over their mobile device’s storage. In the Memory Zone app, users can engage the OptiMem feature, which monitors the phone’s memory levels and informs users whenever internal memory falls below a user-defined threshold. Once this threshold is reached, the OptiMem feature will automatically transfer some of their older photos and videos to the microSD card, leaving more internal memory free to continue making memories.
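As an illustrative sketch of the OptiMem idea (not SanDisk's implementation), a threshold-triggered offload might look like this, with hypothetical paths and a Unix-style filesystem:

```python
# An illustrative sketch: when free internal storage drops below a
# user-defined threshold, move the oldest media files to removable storage.
import os
import shutil

THRESHOLD_BYTES = 2 * 1024**3          # hypothetical floor: 2GB free
INTERNAL, SDCARD = '/data/media', '/sdcard/offload'

def free_bytes(path):
    st = os.statvfs(path)              # Unix-only; a sketch, not Android code
    return st.f_bavail * st.f_frsize

def offload_oldest():
    files = sorted((os.path.join(INTERNAL, f) for f in os.listdir(INTERNAL)),
                   key=os.path.getmtime)          # oldest first
    while files and free_bytes(INTERNAL) < THRESHOLD_BYTES:
        shutil.move(files.pop(0), SDCARD)         # free space, keep the files
```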

The app, available for free download from the Google Play store, is compatible with most Android-powered devices and allows users to locate, organize, transfer and backup data.