Windows Server 2003 retirement details and some server migration tips!

Microsoft officially announced that its extended support for Windows Server 2003 ended on July 14, 2015, and that all computing platforms still running the OS should migrate to a newer version of the company's operating system. If they don't, the server and the clients connecting to Windows Server 2003 will be vulnerable to cyber-criminal activity.

To make the migration from Windows Server 2003 to Windows Server 2008/2012 simpler, here are some tips to help:

Consider Windows Server 2012 R2 – Microsoft has confirmed that support for Windows Server 2012 R2 will run until 2020. Mainstream support for the older Windows Server 2008 version has already ended, and customers now receive only extended support for it. This means it is just a matter of time until a migration from 2008 to a newer version becomes necessary. It therefore makes more sense to upgrade to Server 2012 R2 rather than to Server 2008.

Make a list of applications and workloads – Before the migration, make sure you have a complete list of the applications and workloads that need to be supported by the new server OS. Also compile a list of all software components that may be affected by the move to a newer Windows Server version, including drivers. In particular, Windows admins need to check whether current drivers for the server hardware and attached peripheral devices are available for the new OS. Keep tabs on whether on-premises peripherals such as printers, all-in-one devices and scanners will support the new server OS or whether new ones have to be purchased.
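Part of that inventory can be scripted. Below is a minimal Python sketch, assuming Python is available on the source Windows server, that reads the registry uninstall keys and dumps the installed-application list to a text file (the output file name is an arbitrary choice):

```python
# Minimal sketch: list installed applications from the Windows registry
# uninstall keys before a migration. Assumes Python is available on the
# source server; the output file name is arbitrary.
import winreg

UNINSTALL_KEYS = [
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall",
    r"SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall",
]

def installed_applications():
    apps = set()
    for key_path in UNINSTALL_KEYS:
        try:
            root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path)
        except OSError:
            continue  # key absent (e.g. no Wow6432Node on a 32-bit OS)
        for i in range(winreg.QueryInfoKey(root)[0]):
            with winreg.OpenKey(root, winreg.EnumKey(root, i)) as entry:
                try:
                    name, _ = winreg.QueryValueEx(entry, "DisplayName")
                    apps.add(name)
                except OSError:
                    pass  # uninstall entry without a display name
    return sorted(apps)

if __name__ == "__main__":
    with open("app_inventory.txt", "w") as out:
        out.write("\n".join(installed_applications()))
```

The resulting list is only a starting point; drivers, services and scheduled tasks still need to be checked by hand against the new OS.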

Develop migration scenarios for different server types – For Active Directory and Exchange servers, IT admins need to take special measures, particularly regarding the sequence of the system migration.

What about hardware? – Windows Server 2003 runs on 32-bit processors, but Windows Server 2012 R2 and Windows Server 2016 require a 64-bit CPU. Admins should therefore consider replacing server platforms that are older than three years. New systems also support more recent hardware components such as SSDs and provide enough performance headroom for upcoming Windows Server generations and virtualization projects. Deploying newer components also cuts power consumption and heat generation.

A backup of Windows Server 2003 is essential – It provides a way back if any issues occur during the migration. Admins can restore data, the operating system, applications and hidden partitions if the migration fails, even on new server hardware.

Convert backups into virtual machines – Converting backups into virtual machines brings two advantages. First, the virtual Windows server can be used as a reference point for reinstallation; this can prove indispensable, as Windows Server 2008 and 2012 R2 include new features such as Dynamic Access Control that can cause issues with older applications. Second, the VM allows a temporary rollback to the older server version, which is useful if a major technical issue surfaces during the migration: the IT department can maintain business continuity and spend its time on error analysis and solutions.

Accelerate the migration with a master image – With a master image, a new OS, basic applications and settings can be deployed simultaneously to multiple server systems.

Carefully plan the data migration – Deploying an image backup ensures that all data for the migration is secured. With Exchange 2003, for example, it occasionally happens that SSL certificates are forgotten and are not part of the backup.
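For the file-level part of the data migration, it also helps to verify that what arrived matches what left. The rough Python sketch below copies a directory tree and compares SHA-256 checksums; the source and destination share paths are placeholders:

```python
# Rough sketch: copy a data share to the new server and verify each file
# by checksum. The UNC paths are placeholders for illustration only.
import hashlib
import shutil
from pathlib import Path

SRC = Path(r"\\old-server\data")   # placeholder source share
DST = Path(r"\\new-server\data")   # placeholder destination share

def sha256_of(path, chunk=1 << 20):
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for block in iter(lambda: fh.read(chunk), b""):
            digest.update(block)
    return digest.hexdigest()

def copy_and_verify(src, dst):
    mismatches = []
    for src_file in src.rglob("*"):
        if not src_file.is_file():
            continue
        dst_file = dst / src_file.relative_to(src)
        dst_file.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, dst_file)              # copy with metadata
        if sha256_of(src_file) != sha256_of(dst_file):
            mismatches.append(str(src_file))
    return mismatches

if __name__ == "__main__":
    bad = copy_and_verify(SRC, DST)
    print("verification failures:", bad or "none")
```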

Finally, Windows Server 2003, which has been around for roughly 12 years, is now outdated. To keep pace with the latest technology, a migration to a newer version is essential, as it keeps your server environment safe and secure.

If your IT team needs support with server migration, such as selecting the best server OS for your current IT needs, migrating applications, data and software, or upgrading hardware where needed, approach DNF Professional Services.

DNF Professional Services, a business unit of IT firm Dynamic Network Factory, will give your server environment a complete makeover. A team of experts will review your current server environment, assess what you have and what is missing, and then start the technology selection process. The selection is made with factors such as data migration, server virtualization, performance assessment and optimization, server consolidation, storage consolidation, data security, disaster recovery, business continuity and remote management in mind.

DNF Professional Services, a subsidiary of Dynamic Network Factory, takes on the entire upgrade and migration burden to ensure its customers a smooth, hassle-free transition.

The service doesn't end with the server migration itself. If the customer wants the engagement extended, DNF Corp can also provide full ongoing maintenance and support.

So, feel free to call 510.265.1122 or click on DNF Professional Services.

Posted in DNF Storage

StoneFly IP SAN makes business continuity and customer compliance simple for Nancy Specialty Foods!

Nancy Specialty Foods, which produces around 35 tons of high-end fine foods daily in California and ships throughout North America, is using a StoneFly IP SAN to simplify its business continuity and customer compliance needs. The StoneFly IP SAN lets Nancy Specialty Foods' IT department ensure high availability of mission-critical data, ranging from email and business intelligence tools to the vital Enterprise Resource Planning (ERP) and Electronic Data Interchange (EDI) applications that support the company's supply-chain management strategies.

The technology foundation at Nancy's is reinforced by state-of-the-art solutions from Cisco, Intel and Microsoft. More than 100 end users connect to crucial data stored on 25 servers running Linux as well as Windows 2000 and 2003. While the primary data center is located at the 86,000-square-foot headquarters facility, the team is also responsible for managing a secondary onsite facility as part of a rapidly expanding disaster recovery initiative.

Historically, the company's IT team deployed direct-attached storage devices along with hot-spare disk systems on each server, totaling 50 storage devices and nearly one terabyte of data that required overnight backups and administrative support.

As its IT operations expanded, Nancy Specialty Foods was strengthening its business continuity plans while preparing to embark on a large customer compliance project.

With that in mind, the proactive IT team started exploring ways to streamline storage provisioning as well as bolster backup and recovery needs for important company and customer data.

The biggest challenge Nancy's faced was inefficient storage utilization under its direct-attached storage approach. From a disaster recovery standpoint, the team wanted to separate the company's storage from its servers while managing and provisioning it centrally. They also wanted to eliminate the 10-hour nightly backups that spread across networked servers before being offloaded to a centralized tape library.

While exploring alternatives, the team first considered Fibre Channel-based SAN technology, but the expense of deploying it led the IT department to treat it as a last resort. They then came across iSCSI, an Internet Protocol-based SAN technology that promised to be far more cost-effective. So they opted for an IP SAN (iSCSI storage) and started searching for companies offering it.
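For readers new to iSCSI, the mechanics on a host are straightforward: the initiator discovers targets on the IP SAN over the regular Ethernet network and logs in, after which the remote volume appears as a local block device. A hedged sketch for a Linux host using the open-iscsi command-line tools follows; the portal address and target IQN are made-up placeholders, not actual StoneFly values:

```python
# Hedged sketch: discover and log in to an iSCSI target from a Linux host
# with the open-iscsi tools. The portal IP and IQN are placeholders.
import subprocess

PORTAL = "192.168.1.50"                        # placeholder IP SAN address
TARGET_IQN = "iqn.2015-01.example:volume1"     # placeholder target name

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Ask the portal which targets it offers.
run(["iscsiadm", "-m", "discovery", "-t", "sendtargets", "-p", PORTAL])

# 2. Log in to the chosen target; the LUN then shows up as a /dev/sdX device.
run(["iscsiadm", "-m", "node", "-T", TARGET_IQN, "-p", PORTAL, "--login"])
```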

Nancy's IT department evaluated different IP SAN vendors while also working with Microsoft on a rapid deployment of Windows Server 2008 and Active Directory. As part of that effort, they initiated a hands-on evaluation of leading IP SAN solutions from StoneFly, a wholly owned subsidiary of Dynamic Network Factory.

The StoneFly IP SAN was chosen for its flexible, modular architecture. It also separates the provisioning and management software from the actual disks, much like a software-defined storage approach.

Before deploying the StoneFly IP SAN, however, Nancy's technology team put the product through its paces during a month-long proof-of-concept stress test. Using a Microsoft utility to simulate a 100-user, clustered Exchange environment, they reviewed traffic flow, response time, ease of use and reliability. As expected, it passed with flying colors, meeting all of Nancy Specialty Foods' evaluation criteria without a hiccup.

With the IP SAN deployment, Nancy's was able to reduce its administrative overhead while enabling its constrained IT team to meet increasing storage requirements without adding more staff. The company is also planning to deploy disk-to-disk (D2D) backups to enhance overall business continuity further. With Nancy's primary and secondary mirrors, the mirrored copy readily protects the company's data if an outage occurs at either data center.

To provide an extra measure of protection against data corruption, however, the team plans to implement a third SAN for handling D2D backups while eliminating backup window restrictions completely. A StoneFly DR365 could also fill this role.

To learn more about the StoneFly IP SAN, call 510.265.1616 or click on the StoneFly IP SAN web page.

Posted in DNF Storage

New IP video surveillance systems offer flexibility!

Video surveillance is becoming a key component of security systems in long-term care facilities. In this segment, surveillance cameras are placed in:

  • Parking lots, as a defense against theft and vandalism of vehicles
  • Public areas such as TV rooms, hallways and dining halls where residents and staff gather
  • Entryways and reception areas, to provide images of those coming and going
  • Stockrooms and supply closets, to monitor for theft

At the same time, the newer practice of installing video cameras in residents' living quarters is also receiving positive reviews. Cameras installed there give families a way to monitor their loved one's condition 24/7 and keep an eye on the quality of their care. Two-way video systems with audio are also often used to let residents and their families interact more often than visiting schedules would permit.

And here’s where an upgrade is needed:

  • As useful as they have proven to be, CCTV systems have some drawbacks as well. First and foremost, they require hard wiring and installation by licensed professionals, which can be hard on the budget and so can limit the number of cameras installed.
  • Another drawback of a typical CCTV installation is that images must be stored on tape, which is expensive in terms of management and monitoring time as well as space. Although newer analog systems come with DVR compatibility, so that disk space is available for recording, older CCTV systems installed over the past few years do not support recording to a DVR. An upgrade is clearly needed in this case.
  • The third drawback of CCTV is that systems can be monitored in real-time only from locations that are wired to the cameras.

What’s the solution?

Here's where IP video surveillance cameras offer a newer and better alternative to CCTV. Technically speaking, IP stands for Internet Protocol, and because the surveillance traffic is carried over IP networks, it is called IP video surveillance. Wireless IP cameras greatly simplify installation, even when a system consists of many cameras, and they will work wherever there is access to Wi-Fi or a network router. Video data can be stored on a hard disk or in the cloud and can be accessed from virtually any location on a PC, tablet or smartphone. But as with any technology, there are pros and cons; wireless surveillance systems have their own set of issues.

Centralized vs. Decentralized systems

When considering an upgrade from a CCTV/analog system to IP surveillance, the first step is to determine whether a centralized or decentralized system of IP cameras will better meet your needs.

In a centralized IP arrangement, all you need are cameras, recording software, a dedicated PC/server, attached storage (a NAS or IP SAN), housings to protect the cameras and a network, wired or wireless.

With this set of hardware, centralized IP surveillance works as follows: the cameras handle functions such as video capture, basic analytics and event triggering, while alarm management, storage management and video processing are handled by a central PC running licensed software. Recorded video is processed and then sent to the attached storage device.
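To make that workflow concrete, here is a stripped-down Python/OpenCV sketch of the central recording role: the PC pulls a camera's RTSP stream and writes it to the attached storage volume. The camera URL and storage path are hypothetical, and a real VMS adds analytics, alarm handling and multi-camera management on top of this:

```python
# Stripped-down sketch of the recording role played by the central PC in a
# centralized IP surveillance setup. Camera URL and storage path are
# hypothetical; real VMS software layers analytics and alarms on top.
import cv2

CAMERA_URL = "rtsp://192.168.1.10:554/stream1"   # hypothetical IP camera
STORAGE_PATH = "/mnt/ipsan/camera01.avi"         # hypothetical NAS/IP SAN mount

cap = cv2.VideoCapture(CAMERA_URL)
fps = cap.get(cv2.CAP_PROP_FPS) or 15.0
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

writer = cv2.VideoWriter(STORAGE_PATH,
                         cv2.VideoWriter_fourcc(*"MJPG"),
                         fps, (width, height))

while True:
    ok, frame = cap.read()
    if not ok:
        break                    # stream dropped; real software would reconnect
    writer.write(frame)          # archive the frame to central storage

cap.release()
writer.release()
```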

Although centralized systems may seem to make sense for video security in an institutional setting, they have some real disadvantages, including:

  1. All video is processed through a video server, and if it goes down the whole system is in jeopardy. In today's deployments, however, techniques such as redundancy and failover can mitigate this issue.
  2. Licensing fees for software are usually charged on a per-camera basis, in addition to charges for the server management software license (usually along with an annual maintenance fee).
  3. Even though cameras for centralized systems cost less than those for decentralized systems, the additional costs for software licensing, a central server and maintenance push the total higher than for decentralized systems.

In decentralized security systems, all you need are cameras, lenses, video storage and a network. In this arrangement the cameras carry all the processing and analysis intelligence themselves, along with recording software. Some also have onboard video storage that can serve for temporary purposes, and many have VoIP (Voice over Internet Protocol) functionality that enables the camera to send and receive calls from any kind of phone. Although IP cameras for decentralized systems are far more expensive than those used in centralized systems, they offer all the functionality needed without as many peripherals and licensing fees, so they can prove more economical in the long run.

The advantages of decentralized IP surveillance systems include:

  1. Each camera operates independently, so there is no central point of failure, although a failed camera still leaves a gap in coverage until it is replaced.
  2. Each camera can record to its own integral storage device at the camera — SD card or external hard disk — or to a central storage unit.
  3. If a camera loses connectivity or there is a storage device failure, it will continue to buffer data until the issue is corrected.
  4. Other cameras in the system can be alerted to a failure and programmed to notify you via email, text or phone call with a prerecorded message.
  5. No video management software licensing costs as the software is in the camera, and upgrades are usually free.

What to look for in IP surveillance systems:

Security surveillance cameras come in different sizes and shapes, and different models offer varied resolution and functionality. As you look at the field, try to determine which type best fits the needs of your facility and its residents.

  • Some cameras offer pan/tilt/zoom (PTZ) functionality, which can enable greater coverage in indoor and outdoor areas.
  • Megapixel IP cameras (H.264 compression) offer much better image resolution than VGA cameras.
  • Wireless IP cameras are much easier and more economical to install than hardwired systems, but the installation site must be within Wi-Fi range of a router or hub. Signal availability and interference can still be a concern.
  • For hard-wired installations, consider PoE (Power over Ethernet) IP cameras — to which power can be supplied via Ethernet cables — to reduce the number of power sources. This makes sense for fresh cable installations.
  • Consider cameras with built-in microphones and speakers for two-way communication, where needed.
  • Motion detection/event-triggering functionality can reduce the bandwidth and storage requirements for recorded video (a rough sketch follows this list).
  • Alarm functionality can alert security when a camera records unusual activity.
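As a rough illustration of how motion triggering trims bandwidth and storage, the Python/OpenCV sketch below writes frames only when the scene changes noticeably; the camera source, threshold and output path are arbitrary choices:

```python
# Rough illustration of motion-triggered recording: frames are kept only
# when the difference from the previous frame exceeds a threshold.
# Camera source, threshold and output path are arbitrary choices.
import cv2

cap = cv2.VideoCapture(0)             # any camera or stream source
writer = None
prev_gray = None
MOTION_THRESHOLD = 5.0                # mean pixel difference; tune per site

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)
    if prev_gray is not None:
        score = cv2.absdiff(prev_gray, gray).mean()
        if score > MOTION_THRESHOLD:  # motion detected: record this frame
            if writer is None:
                h, w = frame.shape[:2]
                writer = cv2.VideoWriter("motion.avi",
                                         cv2.VideoWriter_fourcc(*"MJPG"),
                                         15.0, (w, h))
            writer.write(frame)
    prev_gray = gray

cap.release()
if writer is not None:
    writer.release()
```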

Video storage's influence on IP video surveillance:

With demand for high-resolution images increasing, people shifting from analog or conventional CCTV to IP are showing more interest in going for the best, which is why demand for cameras offering detailed video evidence keeps growing. Moreover, under the legal rules prevailing in most parts of America, high-clarity video evidence carries more weight than footage that lacks it. Cameras that deliver high-clarity images, however, generate a lot of data, so an efficient video storage system with capabilities such as RAID and fault tolerance becomes vital for retaining that data for future use.
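As a back-of-the-envelope sizing exercise, raw capacity is simply cameras × bitrate × recording hours × retention days. The camera count, bitrate and retention period below are illustrative assumptions, not recommendations:

```python
# Back-of-the-envelope video storage sizing. All inputs are illustrative
# assumptions, not recommendations for any particular site.
CAMERAS = 32                 # number of cameras
BITRATE_MBPS = 4.0           # average H.264 stream bitrate per camera
HOURS_PER_DAY = 24           # continuous recording
RETENTION_DAYS = 30          # how long footage is kept

bytes_per_day = CAMERAS * BITRATE_MBPS * 1_000_000 / 8 * HOURS_PER_DAY * 3600
total_tb = bytes_per_day * RETENTION_DAYS / 1_000_000_000_000

print(f"~{total_tb:.1f} TB of raw capacity needed, before RAID overhead")
# 32 cameras x 4 Mb/s x 24 h x 30 days comes to roughly 41 TB, and RAID
# parity, hot spares and headroom all add to that figure.
```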

So, while planning a migration from analog to IP, give equal importance to your video storage needs, just as you do to the cameras and other components in the architecture.

Otherwise, it simply doesn't make sense to have sophisticated cameras on premises without efficient video storage.

Posted in Big Data, Data Storage Software, datacenter, Video Surveillance

StoneFly IP SAN makes world’s largest broadcaster get rid of DAS for blade servers and new applications!

Capitol Broadcasting Company (CBC) is known as the world's largest broadcaster, owning five television stations, a radio station and a radio network. The Raleigh-based broadcaster is relying on a StoneFly IP SAN to get rid of direct-attached storage for blade servers and new applications in its IT environment.

By way of history, CBC's WRAL-TV was the first station in the nation to transmit a public broadcast in high-definition television (HDTV) and the first station in the world to air an entire news broadcast using HDTV. With such a huge volume of content to deal with, the company's data storage requirements quickly outpaced its existing direct-attached storage (DAS) resources. The company then faced another storage challenge as it embarked on a plan to implement a new email-archiving system.

CBC's WRAL-TV consumed around 850 GB of local disk for email archiving alone in less than two months. The company could either keep adding disks or take a more sophisticated approach by adding a SAN appliance, which would allow the IT team to pool storage and allocate it across several systems.

While e-mail archiving was the catalyst for moving beyond DAS, other looming IT challenges were also driving the need for more robust storage, including an IBM BladeCenter initiative and the desire to upgrade to a more effective backup system.

With a go-ahead from upper management, the company's team of system engineers began surveying the IP SAN landscape. They looked at NetApp's IP SAN offerings and found them prohibitively expensive. They also evaluated solutions from other leading market players, such as StoneFly, Inc. and LeftHand Networks. In the end, they determined that StoneFly's all-inclusive solutions offered the most SAN for the money, with built-in snapshot and replication capabilities to protect against the loss of critical data, as well as centralized storage management, control and monitoring of logical storage volumes.

CBC chose dual StoneFly Integrated Storage Concentrators (ISC) to support its wide-ranging storage endeavors. Configured for CBC as an active-active cluster for load balancing and complete redundancy, the ISC systems offer all the scalability needed for the company's expanding IT requirements. As the cornerstone of StoneFly's IP SAN product family, which has been shipping since June 2002, all StoneFly ISC systems are designed from the ground up to support next-generation storage technologies including SAS, 4Gb Fibre Channel and 10Gb iSCSI, as well as large-scale IP SAN deployments.

In addition, they are capable of reutilizing direct attached storage resources, as they work in conjunction with StoneFly’s StoneFusion Storage Virtualization Technology.

StoneFusion incorporates Snapshot capabilities for instantaneous data recovery; block-level virtualization for increased storage utilization and capacity provisioning; and a comprehensive range of storage services such as clustering, storage consolidation, access control, volume management, and synchronous and asynchronous mirroring.

CBC gained the following benefits with StoneFly IP SAN deployment:

  • The company was able to retrieve archived items quickly and easily.
  • At the same time, the StoneFly SAN has also made a dramatic impact on CBC's demanding backup requirements. Once required to perform tedious local backups across the enterprise, the CBC IT team now implements backups to disk using the SAN. The organization has experienced a 100% increase in backup performance through this new streamlined process, cutting CBC's backup window in half. With 6 TB of data to manage overall, the team of system engineers has designed a routine that entails full e-mail backups on a nightly basis and differential backups at regular intervals for other aspects of the enterprise.
  • CBC was able to further leverage the new SAN to support WRAL's main file server, which failed in the midst of the new e-mail archiving project. The IT team provisioned another volume on the SAN and attached it to a new server, eliminating the need for local hard drives.
  • With its e-mail archiving and backup challenges behind it, CBC began implementing a long-awaited plan to build a new IBM BladeCenter to accommodate a critical new document management system. The high-density blades, which consume less power and require less cooling than traditional servers, were installed in a 14-server chassis. Because IBM offers BladeCenter systems with or without built-in storage resources, WRAL chose to consolidate storage resources efficiently by ordering the blades without disks. The company was then able to centralize storage on the SAN while using it as the operating system drive for the BladeCenter. This allowed CBC to implement a scenario in which the diskless blades boot directly from the SAN using a QLogic iSCSI HBA.
  • With the StoneFly IP SAN supporting the BladeCenter, CBC was able to eliminate costly disk expenditures for the servers, while also gaining an added measure of redundancy. The flexibility of the SAN is such that, if one blade self-destructs, a replacement blade can be installed easily – without data loss.

Finally, the IP SAN is ready and waiting to support additional requirements at a moment's notice, including a plan for a centralized CBC-wide email system. Overall, the StoneFly IP SAN allows CBC to expand its storage on the fly, keeping pace with the ever-changing storage demands of this dynamic organization.

To learn more, call 510.265.1616 or click on the StoneFly IP SAN page.

Posted in Big Data, Data Storage Software, datacenter, IP SAN, iSCSI, StoneFly

The damage caused by idle servers in worldwide data centers is estimated at $30 billion

Ten million servers are said to be sitting idle in data centers worldwide, according to a recent data center efficiency survey by the Anthesis Group in association with Stanford University. Of that number, 3.6 million comatose servers are said to be in the United States alone.

These 10 million comatose (idle) servers are estimated to represent about $30 billion in unproductive data center capital, assuming an average server cost of $3,000 and ignoring infrastructure capital.

These findings support the idea that ongoing measurement and management of companies’ IT infrastructure is needed to optimize performance, energy use and ROI.

The survey found that by cutting down idle servers, data centers operating worldwide can save 4 Gigawatts of power in global IT loads. Displaced power could then support new IT loads that deliver business value and innovations.
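The headline numbers are easy to sanity-check. The $3,000 average server cost comes from the study; the 400 W per-server power draw below is an illustrative assumption, chosen only to show how a 4-gigawatt total could arise from 10 million machines:

```python
# Sanity-check of the study's headline figures. The $3,000 average server
# cost is taken from the article; the 400 W per-server draw is an
# illustrative assumption, not a figure from the study.
IDLE_SERVERS = 10_000_000
AVG_SERVER_COST = 3_000          # USD, per the article
ASSUMED_WATTS_PER_SERVER = 400   # illustrative assumption

stranded_capital = IDLE_SERVERS * AVG_SERVER_COST
idle_power_gw = IDLE_SERVERS * ASSUMED_WATTS_PER_SERVER / 1e9

print(f"stranded capital: ${stranded_capital / 1e9:.0f} billion")  # $30 billion
print(f"idle power draw: about {idle_power_gw:.0f} GW")            # ~4 GW
```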

The survey also revealed that improper planning when procuring servers is driving up the number of comatose (idle) servers. Servers remain powered on because nobody knows exactly what they are doing at any given time; people figure that if a server is not useful now, it will probably be needed some other day. But years can pass without it processing a single transaction, and the count of idle servers keeps growing.

With server analytics, companies can determine whether it is more cost-efficient to run a transaction in their own data center or in the cloud. For instance, if you only do payroll bimonthly, it may make sense to shrink the server farm and offload background transactions to the cloud when you are not running payroll.

There is a general notion that having servers on premises makes things more secure than procuring computing as a service, and this is where huge capital investments end up doing nothing.

To counter such problems, data center managers should use solutions such as server virtualization and data center infrastructure management, and track upstream traffic or user-access information per server from central IT management, virtualization and workload distribution systems. This identifies servers that are not doing useful work and helps decommission them without adding risk to the business.
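As a simple illustration of that idea, the sketch below flags servers whose average CPU utilization stays under a cutoff as decommissioning candidates. The CSV layout ("server,cpu_percent" per sample) and the 5% cutoff are hypothetical choices, standing in for whatever your monitoring system exports:

```python
# Simple illustration: flag comatose-server candidates from a monitoring
# export. The CSV layout ("server,cpu_percent" per sample) and the 5%
# cutoff are hypothetical stand-ins for real monitoring data.
import csv
from collections import defaultdict

UTILIZATION_CSV = "cpu_samples.csv"   # hypothetical monitoring export
IDLE_CUTOFF_PERCENT = 5.0

samples = defaultdict(list)
with open(UTILIZATION_CSV, newline="") as fh:
    for row in csv.DictReader(fh):
        samples[row["server"]].append(float(row["cpu_percent"]))

for server, values in sorted(samples.items()):
    avg = sum(values) / len(values)
    if avg < IDLE_CUTOFF_PERCENT:
        print(f"{server}: average CPU {avg:.1f}% -> decommission candidate")
```

A low CPU average alone is not proof that a server is comatose, which is why the cross-checks against network traffic and user-access data mentioned above still apply.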

Using servers only when needed does not just save power; it also frees up capacity and results in fewer software licensing fees. Colocation can therefore be a boon in such cases.

With server analytics tools, one can keep tabs on how applications break down into workloads and track where and when they are processed.

To get more such insight into what your current data center is generating and how many servers in your farm are idle, approach Dynamic Network Factory, which has deep knowledge of cabling and infrastructure, power and cooling, rack layout and design, security and management practices.

All of these elements, put together in the right way, can make for long-lasting IT systems and true operational efficiency for your company.

Call 510.265.1122 or click on the DNF Corp web page and fill in your details to get a call back.

Posted in Big Data, datacenter

Integrators and End users express increasing concerns over surveillance video storage!

Video surveillance is proliferating in organizations big and small around the globe. But at the same time, concerns are mounting among system integrators and end users about how they are going to manage this data influx in the months and years to come.

According to research commissioned by hard drive maker Seagate, which mainly surveyed integrators and IT executives, over 74% of respondents said the number of surveillance cameras in use will increase over the next couple of years. Additionally, driven by demand for video analytics over the same period, the strategic value of video surveillance will also increase many times over.

Seagate's Video Surveillance Trends Report included respondents from countries such as the US, UK, India, China and Brazil, drawn from business verticals such as manufacturing, banking, financial services, transportation, technology and retail. The report stated that most organizations in those regions were running a minimum of 200 security cameras 24/7. The number was even higher in the US and UK, where the median number of surveillance cameras was reported to be around 349. Moreover, over 34% of those surveyed said they had significantly increased their number of surveillance cameras over the past 12 months, due to the advantages they offer in addition to their ability to deter crime.

Craig Carmichael, market research analyst for Seagate, said that while they did expect to see above-average growth for video surveillance data compared to the growth typically seen in the data storage market, which is usually somewhere between 20 and 40 percent, they were somewhat taken aback by the breadth of the demand for video.

Aubrey Muhlach, surveillance segment marketing manager for Seagate, said that people are also discovering how valuable this data can be, especially when advanced analytics are applied to it.

These two reasons are strong drivers for people retaining video for longer periods of time than they once did.

According to the Seagate study, 27% of respondents said that they are keeping video footage for a year or even longer and 23% of them admitted that they were retaining the recorded video evidence for a period of 90 days. Another 14% of them said that they hold onto footage for between 60 and 90 days, 23% keep it for 30 to 60 days and only 11% save it for less than 30 days.

All this desire to store surveillance video for longer periods of time places an immense burden on the drives storing the data.

“One of the biggest costs as people are building out their surveillance system is, in fact, the storage. One thing that we have found is a lot of people are using the cheapest drives out there to store their data, which may seem like a nice, upfront investment but what we’re seeing is an earlier failure rate of those drives,” explained Muhlach.

Mo Tahmasebi, CEO and president of DNF Security, a business unit of Dynamic Network Factory that offers video storage solutions for mission-critical video surveillance applications, feels strongly that most surveillance users miss the logic behind the distinction between standard and surveillance-grade hard drives. As a result, they typically take desktop-class drives and plug them into their video management systems. But those drives aren't built to handle these massive numbers of cameras at these high streaming rates, and most importantly they are not built to run all day, every day.

Given that increasing camera counts will also generate more data, there were also mounting concerns among respondents about how they are going to adequately store and maintain this footage. When asked what their organization's challenges were with using their existing primary storage media for storing surveillance footage, 47 percent said maintenance, while 44 percent said capacity. Other challenges noted by respondents included data recovery (44 percent), reliability (40 percent), speed (40 percent) and cost (38 percent).

A large majority of respondents (87%) said that video surveillance is becoming more challenging to manage, and 94% reported that they expect increased infrastructure investment for it. The study also found that most respondents use traditional storage solutions, as well as some form of cloud, to store mission-critical surveillance footage.

The other truth hidden behind these concerns is that in the coming years the appetite for megapixel and HD cameras will multiply, and it remains to be seen how integrators and users will handle the resulting video storage issues.

Posted in Big Data, dnf security, Video Surveillance

The United States and China are the world's biggest producers of electronic waste!

A recent United Nations University study claimed that the United States and China were the world's biggest producers of electronic waste (e-waste), and that much of the e-waste generated by the two nations consisted of appliances such as microwaves and dishwashers.

The study also confirmed that over 41.8 million tonnes of e-waste were dumped around the globe in 2014 and only an estimated 6.5 million tonnes were collected for recycling. Buried within the 41.8 million tonnes of waste were more than 16,000 kilotonnes of iron, 1,900 kilotonnes of copper and 300 tonnes of gold, as well as other precious metals such as palladium. The report estimated that the discarded materials, including the gold, silver, iron and copper, were worth $52 billion.

The United States led e-waste dumping with 7.1 million tonnes in 2014, ahead of China with six million and followed by Japan, Germany and India, it said. Canada ranks lower on the list, in 15th place, with a dump of 725,000 tonnes.

The United States, where individual states run e-waste laws, reported collection of one million tonnes for 2012 while China said it collected 1.3 million tonnes of equipment such as TVs, refrigerators and laptops in 2013.

Norway led per capita waste generation, with 28.3 kg dumped per inhabitant, followed by Switzerland, Iceland, Denmark and Britain. On that ranking, the United States was ninth and China was not among a list of the top 40. Canada dumped 20.4 kg per capita.

As noted above, about 60% of the e-waste generated around the world was made up of items such as large and small appliances, vacuum cleaners, solar panels, video cameras and electric shavers.

The other astonishing fact revealed in the UN study was that electronics makers now manufacture appliances and gadgets with shorter life spans than those made a decade ago. As a result, generated e-waste is expected to grow by more than 60% by 2018.

The only way to tackle this huge mountain of electronic waste is to act responsibly toward the environment. The United Nations study suggested that if every individual on the planet learned about the hazards of e-waste, much of the electronic dump could be handed over to responsible recyclers, keeping those hazards out of the environment.

If you are in or around California, approach DNF Recycling Services, a business unit of Dynamic Network Factory that runs an e-waste disposal program. It offers a one-stop solution for recycling your old equipment through a hassle-free five-step process.

  1. DNF Recycling Services will arrange for pickup and transportation of the equipment to their office premises. Typically there is no fee associated with the pickup and removal of equipment. However, the distance of travel is currently limited to the state of California. If travelling is required outside of California, please call in advance at 510.342.5884 and ask for details.
  2. After picking up the discarded electronic equipment and transporting it to their facility, a detailed inventory and assessment of the equipment is performed, and a report is generated and provided to the customer at no cost.
  3. Each unit is then carefully disassembled and its parts are sorted based on their recyclable properties and whether or not they contain hazardous materials. This process is performed in full compliance with US Environmental Protection Agency (EPA) regulations and client requirements.
  4. If the client hands over hard disks and wants the disposal handled responsibly, DNF will erase the data from the drives using one of three methods. The first is overwriting the drive with 0s and 1s, which permanently erases any trace of data. The second is drilling holes into the hard drive platters in a way that prevents further access to them. The third involves sanding the platters with an industrial sander, the most thorough way to render the data unrecoverable. All of this is done for a nominal fee.
  5. The main highlight is that DNF Recycling offers its customers a credit toward new components or electronics. The credit, however, depends on many factors measured against DNF's benchmarks.

To learn more about what your e-waste can earn for you, call 510.962.5012 or click on the DNF Recycling web page.

Posted in DNF Storage

Hyper-converged systems can reduce technical debt in environments with fewer than 200 virtual servers!

Mid-sized data centers running fewer than 200 virtual server machines can reduce their technical debt by switching to hyper-converged systems in their next data center refresh. This was revealed in Gartner's latest virtual server study, a survey of small and midsize business buying preferences.

The world-renowned IT analyst firm's recommendation is already said to be shaping the data center business worldwide, with 40% of midsize businesses expected to replace all data center servers and storage with integrated systems by 2018, up from single digits this year.

Generally, the key challenges for smaller businesses are infrastructure complexity and maintenance.

So, highly virtualized businesses with fewer than 200 virtual servers should opt for hyperconverged infrastructure.

Hyper-converged infrastructure systems can be significantly cheaper than best-of-breed infrastructure components such as rack servers or even conventional SANs, and their simpler, software-centric architecture means lower running costs.

Today, mid-market businesses spend almost 70% of their IT budgets just keeping the business running, and so have to cope with the technical debt associated with maintenance, integration complexity and performance within the data center.

If these businesses switch to hyper-converged systems, they can free up IT budget for things that actually create business value.

However, the in-house IT staff may also need to be replaced with more skilled staff who can manage the environment with ease. Will their pay not increase the costs again?

Posted in StoneFly, Video storage platform, virtual servers

More than 28 billion gigabytes of storage shipped in early 2015!

International Data Corporation (IDC) released a new report noting that worldwide data storage hardware sales boomed in early 2015. According to the report, more than 28 billion gigabytes of storage capacity were shipped in the first two quarters of 2015.

The report, "IDC's Worldwide Disk Storage Systems Market," also confirmed that worldwide enterprise storage systems revenue grew 6.8% year over year during the period, reaching the $8.8 billion mark.

IDC expects overall vendor revenue to reach the $23.6 billion mark if the current growth curve continues.

Spending on traditional external arrays fell during the quarter while demand for server based storage and hyper-scale infrastructure was up strongly.

Note: Hyperscale computing was pioneered by Facebook and Google a few years back and refers to distributed infrastructures that support cloud and big data processing and can scale to thousands of servers.

The largest revenue growth occurred in the server market, where internal storage sales were up by 23.3%. Internal storage sales benefited from healthy server sales and not just upgrades to existing server infrastructure. Storage systems that were sold directly to hyper scale or internal cloud data center users accounted for 12.6% of total spending in the quarter, up by 22.9% year over year.

External disk storage system revenue fell during the first quarter and was down by 0.6% to $5.6 billion year over year.

High-end storage array sales were up 1.3% from a year ago to $1.5 billion. IDC said the uptick represented a “reprieve from the protracted drop in sales the high-end market has been experiencing.”

Entry-level storage sales grew by 1.1% to $1.4 billion, whereas mid-range arrays, which accounted for the largest portion of the market, declined by 2.4% to $2.8 billion.

The most popular external storage arrays were all-flash models and hybrid flash arrays, which combine NAND flash with hard disk drives. All-flash and hybrid arrays drove the external storage market in the first quarter with 81.6% revenue growth ($403.1 million) and 9.1% ($2.5 billion) year-over-year growth, respectively.

Posted in Big Data, IP SAN, iSCSI, StoneFly

Why should backup and disaster recovery be converged?

Backup and disaster recovery were treated as two separate disciplines until a couple of years ago. Now, thanks largely to server virtualization, the two are offered as converged solutions. Read on to see how this is being made possible.

Earlier, backup was about making a recoverable copy of data and disaster recovery was concerned with business continuity, where the idea was to move workloads to new hardware and remote locations in the event of a major disaster.

But because most organizations are now heavily virtualized, there is enough flexibility to unite these two technologies into a single, easy-to-manage solution. Hypervisors now deliver new capabilities, and backup vendors design features that exploit the portability of virtual machines.

To get a clear picture, consider the instant recovery feature found in many newer backup applications. Restoring a server from backup used to take hours or even days to complete. Now, with instant recovery, the combination of the hypervisor and disk-based backup allows the backup copy of a virtual machine to be brought back to life almost immediately after downtime.

With the backup virtual machine copy, an organization can use it just as it would the production virtual machine. A traditional restoration is still eventually required, but it occurs in the background after the backup VM has been brought online.

This is a great example of how backup and DR are converging. The backup application is still making point-in-time copies of the VMs, but instant recovery capabilities have reduced the recovery time objective to levels that were previously attainable only with expensive availability features such as failover clustering and virtual machine replication.
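To make the instant-recovery idea concrete outside any particular product, here is a hedged, product-neutral sketch: boot a disk-image backup directly as a temporary VM using QEMU's -snapshot option, so writes go to a throwaway overlay while the real restore runs in the background. The image path, memory size and QEMU invocation are illustrative and not any vendor's mechanism:

```python
# Hedged, product-neutral sketch of the instant-recovery idea: boot a
# backup disk image as a temporary VM. QEMU's -snapshot flag keeps the
# backup image read-only by redirecting writes to a temporary overlay,
# so a full restore can proceed in the background. Paths are illustrative.
import subprocess

BACKUP_IMAGE = "/backups/fileserver-latest.qcow2"   # illustrative path

subprocess.run([
    "qemu-system-x86_64",
    "-m", "4096",                  # 4 GB of RAM for the temporary VM
    "-drive", f"file={BACKUP_IMAGE},format=qcow2",
    "-snapshot",                   # never modify the backup image itself
    "-net", "nic", "-net", "user", # basic user-mode networking
], check=True)
```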

The StoneFly Backup & Disaster Recovery series of appliances embodies this convergence. These appliances offer backup and disaster recovery features in a single, easy-to-manage solution that covers physical servers, virtual servers and workstations, so you can manage all backup operations for your data center or office from a single central management console.

The backup engine automatically creates backup images of physical servers based on flexible user-defined policy. These images can be restored (bare metal recovery) to the same hardware, to dissimilar hardware to build a new server, or can be mounted as a drive to retrieve an earlier copy of a specific file, folder, etc.

Every backup can also be converted automatically into a virtual machine. This feature is quite useful for maintaining business continuity in your production environment if a physical server or workstation goes down and needs to be repaired. Replica virtual machines can also be used for any testing the user might choose, including non-invasive compliance testing.

Posted in Big Data, Data Storage Software, Disaster Recovery, iSCSI, StoneFly