May 1st, 2013
10:03 am

Posted by

SmarterComputingBlog

Couldn’t join us live for the first installment of Smarter Computing’s What’s Next for IT debate series? Catch up now!

The first debate explored the question “Does infrastructure really matter when it comes to cloud?” Moderator David Linthicum, a noted enterprise technology expert with InfoWorld and Blue Mountain Labs, was joined by leading cloud experts who discussed whether infrastructure design and components matter in cloud deployment, along with infrastructure management and other key topics. Click the button below to view the debate, and stay tuned for the next installment.

 

Here’s more information on our moderator and panelists:

 

David Linthicum – Moderator

Leading technology publications frequently name David S. Linthicum among the top 10 enterprise technologists in the world. He is a true thought leader in the industry and an expert in complex distributed systems, including cloud computing, data integration, service-oriented architecture (SOA) and big data systems. The author of more than 13 books on computing and over 3,000 published articles, and a regular radio and TV commentator on computing, he is often quoted in major business and technology publications. In addition, David is a frequent keynote presenter at industry conferences, with over 500 presentations given in the last 20 years.

 

Frank De Gilio: IBM, STG Chief Cloud Architect – Panelist

With over 30 years’ experience, Frank focuses on providing enterprise-wide cloud solutions to clients who want to leverage their IT to become more cost-effective and agile. His approach considers an enterprise’s holistic cloud requirements, uniting the development, operational and business aspects of the cloud deployment model to ensure that a business weighs all of the implications of implementing the technology.

 

Larry Carvalho: Principal Consultant at RobustCloud LLC – Panelist

Larry provides strategy and insight into the adoption of cloud computing technologies. He offers advisory services and works closely with customers and vendors to help all parts of the ecosystem understand cloud computing, map business goals and objectives, review and fine-tune strategies, identify benefits and return on investment, and more. He brings extensive experience in crafting and delivering innovative solutions to the enterprise market.


Posted by

SmarterComputingBlog

David Linthicum, leading enterprise technologist, shares his thoughts ahead of the What’s Next for IT debate.

I’m honoured to be participating in a debate sponsored by IBM, entitled “IBM Smarter Computing’s What’s Next for IT Debate.” This will be a frank and honest discussion about cloud, data and security topics that are top concerns for business and tech clients. The debate will be moderated and led by expert IT influencers.

This debate will ask the question: “Does infrastructure really matter when it comes to cloud?”

The core issue around cloud and infrastructure is that “the cloud” is both a provider of infrastructure and a consumer of it. Thus, we have to consider both the foundations of PaaS, IaaS and SaaS clouds and the clouds that provide foundational infrastructure themselves.

The reality is that cloud computing is becoming the new platform for business. This platform must therefore provide a resilient, high-performance infrastructure that can remain in constant production while retaining the ability to scale.

Yet most of those who leverage or provide cloud services don’t understand the value of infrastructure as it pertains to core operations. It’s a less-than-sexy concept, one that becomes a topic of conversation only when outages or performance issues cost the business real money.

Certain aspects of infrastructure are becoming even more important when considering cloud computing:

  • Resiliency, or the ability to keep running no matter what (one common pattern for this is sketched after the list).
  • Performance, or the ability to provide core services that keep up with the demand of the business.
  • Elasticity, or the ability to expand and contract to align with the requirements of the business.
  • Security, or the ability to protect critical information and processes.
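
To make the first of these concrete, here is a minimal, hypothetical Python sketch of one common way applications tolerate transient infrastructure failures: retrying a call with exponential backoff. The function names and the simulated flaky operation are illustrative stand-ins only, not part of any IBM offering.

```python
import random
import time

def call_with_retries(operation, max_attempts=5, base_delay=0.5):
    """Invoke `operation`, retrying with exponential backoff on failure.

    A basic resiliency pattern: transient infrastructure errors are retried
    a bounded number of times; persistent failures still surface to the caller.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # give up: the failure does not look transient
            # Exponential backoff with a little jitter to avoid thundering herds.
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.1)
            time.sleep(delay)

# Example usage: a stand-in operation that fails twice, then succeeds.
_calls = {"n": 0}

def flaky_call():
    _calls["n"] += 1
    if _calls["n"] < 3:
        raise ConnectionError("transient network error")
    return "ok"

if __name__ == "__main__":
    print(call_with_retries(flaky_call))  # prints "ok" after two retries
```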

Come to the debate and discover why this is one of the most important topics to consider, as we continue to migrate into the cloud.

 


April 23rd, 2013
12:05 pm

Posted by

SmarterComputingBlog

Companies that insure our road vehicles request information including the driver’s age, gender (no longer legal in Europe), claims history and the ZIP or post code where the vehicle is parked at night. On this narrow data set, insurers construct an analytic model used to assess and price risk. A ruling from the European Court of Justice thinned an already sparse model: from 21 December 2012, insurers in that region can no longer vary the price of their premiums based on gender.

The immediate effect has been an increase in premiums charged to women in the UK by as much as £500, as reported by the Guardian newspaper. The existing model also disadvantages the young who, as new drivers, haven’t yet had the opportunity to establish records as safe road users. Using age as its determinant, the model groups potentially low-risk drivers and reckless drivers together, and prices their risk similarly. Undoubtedly, some young people drive dangerously – and neurological research explains why. Adolescents have reasoning ability close to that of adults, but they display high rates of poor decision-making: in a risky driving scenario, teens took more risks when tested in the company of friends, while adults did not.

The current model used for vehicle insurance is imprecise: reliant on a small data set, it is ill-equipped to identify low-risk individuals within a cohort of higher-risk drivers. Further, the recent ECJ ruling exposed the model’s fragility: women driving on Europe’s roads on 22 December 2012 represented no greater risk than they did the previous day, but because the insurers’ pricing tool captured so little useful information, it responded to a legal change by abruptly increasing prices. Models capturing larger data sets would demonstrate greater flexibility.

Launched in May 2010, insurethebox pioneers a data-driven model of vehicle insurance in the UK market. The company fits telematics technology to measure how, where and when each insured vehicle is driven, and this information is delivered to a warehouse for analysis. The company’s data scientists assess drivers individually and make evidence-based decisions to price their insurance. It transpires that driving behaviours such as fast acceleration and deceleration, and taking corners at speed are better indicators of a high-risk driver than age, gender or post code.
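
As a rough illustration of the kind of scoring a telematics model might perform, here is a hedged Python sketch that derives a simple risk indicator from recorded speed samples. The field names, thresholds and weighting are hypothetical; insurethebox’s actual models are not public.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SpeedSample:
    timestamp: float   # seconds since the start of the trip
    speed_kmh: float   # vehicle speed in km/h

def harsh_event_count(samples: List[SpeedSample], threshold_kmh_per_s: float = 8.0) -> int:
    """Count harsh acceleration/braking events: large speed changes per second."""
    events = 0
    for prev, curr in zip(samples, samples[1:]):
        dt = curr.timestamp - prev.timestamp
        if dt <= 0:
            continue
        rate = abs(curr.speed_kmh - prev.speed_kmh) / dt
        if rate > threshold_kmh_per_s:
            events += 1
    return events

def trip_risk_score(samples: List[SpeedSample]) -> float:
    """Toy risk score: harsh events per hour of driving (higher = riskier)."""
    if len(samples) < 2:
        return 0.0
    duration_h = (samples[-1].timestamp - samples[0].timestamp) / 3600
    return harsh_event_count(samples) / max(duration_h, 1e-6)

# Example: a very short trip containing one hard braking event.
trip = [SpeedSample(0, 50), SpeedSample(1, 52), SpeedSample(2, 30), SpeedSample(3, 32)]
print(round(trip_risk_score(trip), 1))
```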

Insurethebox further differentiates its product by feeding back the results of its analyses to the data creators – its customers – via personalized, secure web portals. The company advises customers what actions they can take to reduce their risk of accident – and the cost of their insurance premium – and promotes good driving behaviours to cultivate. While useful to drivers of all ages, this information may be particularly valuable to younger drivers, who, according to Professor BJ Casey, Director of the Sackler Institute for Developmental Psychobiology, respond better when their good decisions are rewarded than when their bad ones are punished.

Insurethebox’s use of big data and analytics to derive a better price for risk is working as a business model. Based on a survey of 1,498 of its customers undertaken between June and November 2012, the company found they save on average £601 on their car insurance. According to data from the UK Government, the median gross weekly earnings for 18–21 year olds in full-time employment are £280 per week, or £14,560 a year – so telemetry-based insurance has the potential to save these young people more than 4% of their pre-tax income (£601 ÷ £14,560 ≈ 4.1%).

While privacy concerns deter some consumers from adopting telemetry-based insurance, the economics make it too attractive for young people to ignore. Sensing which way the wind is blowing, other companies are following insurethebox’s pioneering lead. I think we are experiencing only the beginning of a wave – data from existing in-vehicle systems will swell the telemetry stream, showing, for example, whether a driver engaged their indicators before changing lane and then braking hard to avoid a collision. Growing data volumes and deeper analyses will create further opportunities to refine the pricing of risk and coach for better driving.

In the vehicle insurance sector, the older model used to price risk is limited by knowing too little. Understanding an individual’s driving behaviour informs precise pricing of risk – analysing larger data sets reduces the price of vehicle insurance. Big data is creating opportunities across industries; pioneering companies will create new business models, disrupt established markets, and create value for customers and shareholders.

To learn more about how big data can help insurers, download Harness the Power of Big Data for Insurance.

 

This post by Mike Kearney, Senior Product Marketing Director, was originally featured on The Big Data Hub.


April 19th, 2013
9:52 am

Posted by

SmarterComputingBlog

What’s next for IT? Debate Series: IT experts discuss cloud infrastructure

Smarter Computing is IBM’s approach to IT innovation in an information-centric era. It helps IT leaders seize the opportunities of a smarter planet by thinking differently about the way their organisations can:

  • Unleash innovation through the cloud
  • Unlock the power of big data and
  • Safeguard the security of critical information and business processes.

IBM will debut “Smarter Computing’s What’s Next for IT” — a series of stimulating one-hour debates on cloud, data and security topics that are being discussed daily inside business organisations.

The debates will be held monthly and will be moderated and led by expert IT influencers. They will be broadcast live using Spreecast, a socially integrated video platform.

Click on the button below to register for the debate!

The first debate will probe the question “Does infrastructure really matter when it comes to cloud?” and will be moderated by David Linthicum, a noted enterprise technology expert with InfoWorld and Blue Mountain Labs. Don’t miss this opportunity to hear from leading cloud experts. Our panelists will debate whether infrastructure design and components matter in cloud deployment, and will discuss cloud infrastructure design, infrastructure management and other key topics.

Date: May 1st | Time: 3:00 – 4:00 AM AEST

The debate will also be available to replay on demand; check back here for updates!

Participants will be able to join the conversation and ask questions during the debate, directly on Spreecast, using a Twitter stream. Also, make sure to follow the Twitter hashtag #SCDebate to keep track of the conversation around the discussion. After you’ve registered, you’ll receive an email the day before the debate with details on how to join the event on Spreecast.

 

Here’s more information on our moderator and panelists:

 

David Linthicum – Moderator

Leading technology publications frequently name David S. Linthicum among the top 10 enterprise technologists in the world. He is a true thought leader in the industry and an expert in complex distributed systems, including cloud computing, data integration, service-oriented architecture (SOA) and big data systems. The author of more than 13 books on computing and over 3,000 published articles, and a regular radio and TV commentator on computing, he is often quoted in major business and technology publications. In addition, David is a frequent keynote presenter at industry conferences, with over 500 presentations given in the last 20 years.

 

Frank De Gilio: IBM, STG Chief Cloud Architect – Panelist

With over 30 years’ experience, Frank focuses on providing enterprise-wide cloud solutions to clients who want to leverage their IT to become more cost-effective and agile. His approach considers an enterprise’s holistic cloud requirements, uniting the development, operational and business aspects of the cloud deployment model to ensure that a business weighs all of the implications of implementing the technology.

 

Larry Carvalho: Principal Consultant at RobustCloud LLC – Panelist

Larry provides strategy and insight into the adoption of cloud computing technologies. He offers advisory services and works closely with customers and vendors to help all parts of the ecosystem understand cloud computing, map business goals and objectives, review and fine-tune strategies, identify benefits and return on investment, and more. He brings extensive experience in crafting and delivering innovative solutions to the enterprise market.

 


Posted by

SmarterComputingBlog

Since Geoffrey Moore, author of Crossing the Chasm and Managing Director of Geoffrey Moore Consulting, defined the terms, much has been written about systems of record and systems of engagement. Numerous authors describe what they see as a shift in computing between these two types of business systems. Their main thrust is often that core business applications (systems of record) like ERP or CRM and new customer-facing mobile, social, and web applications (systems of engagement) are somehow on parallel paths with little in common.

Read more on this topic from Geoffrey Moore

While that thought makes interesting headlines and spurs online debates over computing philosophy, the discussion we ought to have is how we get the most value from both types of systems. How do we ensure that IT infrastructure is ready, given that many customer-facing systems are now business-critical? In the 2012 IBM Global Chief Executive Officer Study, 72 percent of CEOs said they wanted to “improve response time to market needs.” For the first time since the study began, CEOs (71 percent) identified technology as the most important external force impacting their organizations. With CEOs focused squarely on gaining competitive advantage through technology, CIOs need to make the critical architecture choices that enable them to deliver results that meet that expectation.

Core business systems such as ERP and CRM and the IT infrastructure they are built on are a crucial component in responding to market needs. They deliver the database capabilities, support the operations, and provide the reliability, availability, and security that our businesses – and customers – demand.

And while we’re connected with a business through a BYOD (Bring Your Own Device) interface, we’re creating data, completing transactions, and making decisions – all of which need the level of business controls, security, and reliability we require for critical, core business systems. After all, how happy would we be if our new Nexus 4 phone never shipped because the Google Play engagement system didn’t pass the order to their ERP system for fulfillment?

In this increasingly complex environment, core business systems and consumer systems are both vital to businesses and cannot run on independent paths. CIOs and IT Architects need to evaluate their IT infrastructure in light of the need for integration of these two types of business systems and based on the growing criticality of customer-facing systems. Building a shared IT infrastructure can offer advantages for both types of systems – enhancing scale, security, availability, reliability, and workload management for engagement systems, and extending transactional systems through a BYOD or social model.

IBM’s enterprise systems (Enterprise Power Systems and zEnterprise) offer a unique set of capabilities today that supports this vision of an integrated IT infrastructure running all of the critical enterprise applications – whether they are core transactional systems or customer-facing systems. They extend the efficiency, security, database, and workload optimization advantages of a shared IT infrastructure to consumer systems. With their sophisticated virtualisation and workload management, enterprise systems also offer user-centric, open, rapid-deployment, and dynamic-scale capabilities to help evolve systems of record.

How are you handling these business systems and their related IT infrastructure in your enterprise? How is your business ensuring that BYOD consumer systems are secure and reliable and that customer data is kept private? If you’ve solved this equation, connect with us via the enterprise systems web site or comment below and share your story!

 

This post by Doug Brown was originally featured on the Smarter Computing Blog (US). Doug is Vice President of Marketing for Smarter Computing, Power Systems and System z brands. Previously, Doug spent twelve years in IBM’s Software Group where he held several marketing leadership positions including global leadership for the Tivoli brand. You can reach him on Twitter @dougbrown700

 


April 8th, 2013
4:51 pm

Posted by

SmarterComputingBlog

Throughout March, IBM partnered with CIO Magazine to present a five-city event series exploring how to prepare for the strategic and business challenges of tomorrow.

We were joined by innovative business leaders and industry analysts who see first-hand the challenges facing CIOs in an ever-changing IT environment. Revisit their presentations below and see how you can become tomorrow-ready.

 

Click here to access a range of Tomorrow-Ready CIO assets including whitepapers, videos and photos.

Ross Dawson on stage at Tomorrow-Ready CIO, Sydney. Source: CIO Magazine

 


Posted by

SmarterComputingBlog

More school systems across the U.S. are looking for ways to balance their commitment to providing a top-notch education with the pressure of keeping their buildings in tip-top shape. To achieve this, some schools are moving away from paper-based systems and putting all their data, from operational and maintenance information to real estate and resource data, online. Doing so, however, is creating a whole new set of issues as the schools are now left to deal with the management of “Big Data.”

Since it’s unreasonable to build brand-new, energy-efficient buildings from the ground up, more school districts are looking within and starting to exploit the Big Data in their building information. They’re sifting through critical data to make school structures more energy-efficient and more cost-effective.

School districts from Portland, Oregon to Palm Beach, Florida are taking this approach. And with IBM’s help they’re finding highly profitable solutions that are helping to cut costs, save energy and enable schools to make smarter decisions on how school buildings are maintained and used.

In times of tight budgets, being able to do more with the data has huge benefits. For schools, this enables them to cost-effectively and easily maintain old buildings that consume enormous amounts of energy and continually demand unexpected expenditures for equipment repairs.

Here are just a few examples of schools that are putting Big Data to work in their buildings with positive results:

 

 

This post by Christopher Luongo, Writer/Strategist, IBM Communications, was originally featured on the Smarter Planet blog.


March 20th, 2013
9:32 am

Posted by

SmarterComputingBlog

Our Smarter Computing Radio podcast has showcased various IT issues and challenges faced by companies today.

Smarter Computing is a term that we all use, but how much do we know about it? What are the different factors that affect or make up Smarter Computing? What is Big Data? Is Big Data just another name for data analytics, or does it represent more? How can virtualization affect your company’s operations or challenges? Are Smarter Data Centers the way to go?

Check out this podcast to hear more on these topics from IBM subject matter experts, and find answers and insights to the challenges faced by the IT industry today. It is a rich resource of up-to-date opinion and information shared by top IBM experts.

Listen now!

 


Posted by

SmarterComputingBlog

From retailers and telecommunications companies to utility companies and healthcare providers, organisations need new ways to capitalise on the tremendous volume and variety of data that is available to them. These organisations realise that analysing this data will be critical to producing insights that help attract and retain customers, improve operational efficiency, thwart fraud, enhance product development, and more. Unfortunately, traditional enterprise data warehouse (EDW) architectures alone cannot accommodate the diversity of analytics that organizations need to perform—which is why they need a smarter approach to analytics.

This approach must enable them to conduct multiple types of analysis—including deep, complex analysis, tactical/operational analysis, and spatial analysis—and incorporate a wide variety of data types, from structured data to unstructured and streaming data. To maximise the value of analysis, organisations must be able to apply the right level of infrastructure performance to each analytics workload and deliver results at the right time for the right cost.

For most organisations, building homegrown solutions is not the answer. Attempting to acquire and integrate all the necessary components can be a costly, time-consuming, and error-prone process. Managing those homegrown solutions might require deep expertise. Organisations need integrated solutions that simplify deployment and speed the time-to-value while streamlining ongoing management.

The IBM big data platform is helping organisations address these challenges with solutions for capturing a wide variety of data and applying a full array of analytics. Two recent additions to the IBM big data platform can help organizations supplement traditional EDW architectures, expanding the types of analytics workloads that can be run while accelerating results. The IBM® PureData™ System for Analytics enables deep, complex analytics on large-scale data volumes while the IBM PureData System for Operational Analytics provides the near-real-time analytics needed for tactical decision making.

These PureData System offerings are part of the IBM PureSystems™ portfolio—a collection of expert integrated systems that provide pre-integrated solutions that draw on years of IBM expertise from thousands of successful implementations. Both PureData System solutions work with other solutions from the IBM big data platform to help organisations capitalise on emerging opportunities from big data.

 

Conducting deep analytics on large-scale data volumes

Building on the asymmetric massively parallel processing (AMPP) architecture and in-database analytics functions used for IBM Netezza® solutions, the IBM PureData System for Analytics is designed to deliver outstanding performance for deep analysis of very large volumes of data. Organisations can examine billions of records and explore more variables, find more patterns, and deliver results faster than before. At the same time, the advanced I/O subsystem architecture enables the PureData System for Analytics to boost performance for mixed workloads, so organisations can conduct deep analysis while also supporting shorter, more tactical queries.

Organisations are using this technology for their large-scale data volumes to enhance marketing capabilities, reduce churn in telecommunications, optimise digital advertising, advance medical research, and more.
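
To give a flavour of the set-based, deep analysis described above, here is a hypothetical Python sketch that runs a churn-style aggregation through SQL. SQLite stands in for the warehouse purely to keep the example self-contained; the schema, data and thresholds are invented and are not PureData or Netezza specifics.

```python
import sqlite3

# Stand-in for a warehouse connection; a real deployment would target the
# analytics appliance through its own driver. Schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE call_records (customer_id INTEGER, month TEXT, dropped_calls INTEGER, minutes REAL);
INSERT INTO call_records VALUES
  (1, '2013-01', 0, 320.0), (1, '2013-02', 1, 280.0),
  (2, '2013-01', 7, 150.0), (2, '2013-02', 9,  90.0);
""")

# A churn-style aggregation: flag customers with many dropped calls and a
# large swing in usage -- the kind of query pushed down into the warehouse.
query = """
SELECT customer_id,
       SUM(dropped_calls)          AS total_dropped,
       MAX(minutes) - MIN(minutes) AS usage_swing
FROM call_records
GROUP BY customer_id
HAVING SUM(dropped_calls) > 5
ORDER BY total_dropped DESC;
"""
for row in conn.execute(query):
    print(row)  # -> (2, 16, 60.0)
```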

 

Enhancing tactical analytics and delivering near-real-time insights

The IBM PureData System for Operational Analytics is designed for organisations that need to conduct tactical, near-real-time analytics and deliver results to thousands of concurrent users, from phone-based customer service agents to retail personnel at the point of sale. This PureData System takes advantage of new IBM DB2® 10 features that enable the continuous ingest of data and support thousands of queries per second. With the PureData System for Operational Analytics, organisations can incorporate business intelligence more directly into their processes and generate actionable insights at the moment when they will deliver the greatest impact.

One global credit card provider that handles 80 million transactions per day is using this technology to support real-time fraud assessment and customer care. A natural gas supplier is gaining real-time information on changing customer demand for natural gas.
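
The continuous-ingest idea can be sketched, in a deliberately simplified form, as a micro-batching loop: incoming records are committed in small batches so that tactical queries always see near-current data. SQLite again stands in for the operational warehouse; the batch size, schema and feed are illustrative assumptions rather than DB2 or PureData behaviour.

```python
import sqlite3
from itertools import islice

def transaction_feed():
    """Stand-in for an incoming stream of card transactions."""
    for i in range(10):
        yield (i, f"card-{i % 3}", 25.0 + i)

def ingest_continuously(conn, feed, batch_size=4):
    """Insert records in small batches so readers always see near-current data."""
    conn.execute("CREATE TABLE IF NOT EXISTS txns (id INTEGER, card TEXT, amount REAL)")
    feed = iter(feed)
    while True:
        batch = list(islice(feed, batch_size))
        if not batch:
            break
        conn.executemany("INSERT INTO txns VALUES (?, ?, ?)", batch)
        conn.commit()  # each small commit makes the fresh rows visible to queries

conn = sqlite3.connect(":memory:")
ingest_continuously(conn, transaction_feed())
# A tactical, operational-style query over the freshly ingested data.
print(conn.execute("SELECT card, COUNT(*), SUM(amount) FROM txns GROUP BY card").fetchall())
```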

 

Bolstering additional big data solutions

When used in conjunction with other solutions from the IBM big data platform, these PureData System solutions can expand analytics insights. For example, a utility company could analyze metering information streaming in from the grid using IBM InfoSphere® Streams to look for abnormalities that might signal network problems or indicate fraud. The company could then further analyse those tremendous data volumes with the PureData System for Analytics, using predictive analytics to anticipate fraud or using spatial analytics to locate potential network problems.
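
As a small, hypothetical sketch of the streaming step in that scenario, the following Python generator flags meter readings that deviate sharply from a rolling average. The window size and threshold are arbitrary choices for illustration; a real InfoSphere Streams application would express this differently and at far greater scale.

```python
from collections import deque

def flag_abnormal_readings(readings, window=5, threshold=3.0):
    """Yield (index, value) for readings far from the recent rolling mean.

    A simple streaming heuristic: compare each new value against the mean and
    standard deviation of the last `window` readings.
    """
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window:
            mean = sum(recent) / window
            var = sum((x - mean) ** 2 for x in recent) / window
            std = var ** 0.5
            if std > 0 and abs(value - mean) > threshold * std:
                yield i, value  # possible fault or fraud: route for deeper analysis
        recent.append(value)

# Example: a steady meter with one suspicious spike.
stream = [10.1, 10.0, 10.2, 9.9, 10.1, 10.0, 42.0, 10.1]
print(list(flag_abnormal_readings(stream)))  # -> [(6, 42.0)]
```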

 

Expanding analytics capabilities

As organisations continue to realise the tremendous value of big data, they will need systems that can handle a greater volume and variety of data, conduct a broader range of analytics, and deliver results faster than ever before. The introduction of the IBM PureData System for Analytics and the PureData System for Operational Analytics demonstrates the ongoing development of solutions by IBM to meet the challenges and capitalise on the opportunities of big data.

For more information about the new IBM PureData System offerings, visit ibm.com/puredata.

 

 

This article by Greg Thomas and Nancy Kopp was originally featured in IBM Data Magazine.


March 7th, 2013
1:39 pm

Posted by

SmarterComputingBlog

This year’s Australian Open tournament once again provided a fantastic forum for the world’s leading players to demonstrate their sporting prowess.

Many athletes are now using analytics to unlock insights into both their own and their opponents’ performance, which has raised the competition to a new level. And with access to analytics-driven tools like IBM SlamTracker, the spectator experience has never been better.

As IBM’s Chief Operating Officer, I observe these developments with keen interest. What most people don’t realise is that the tournament’s ongoing advancements are largely driven by IBM technologies.

Throughout the 20-year partnership with Tennis Australia, IBM has delivered more innovative and advanced technologies than could have possibly been imagined.

Many of the more recent highlights have been driven by IBM’s position of leadership with cloud computing. In fact, since the Australian Open website moved to the private cloud in 2008, Tennis Australia has seen a 35% reduction in the cost of each page view despite a 42% increase in page traffic.

Read the Australian Open Case Study here.

This post by Janet Matton, VP Operations IBM A/NZ, was originally featured on the Business Insight blog.
