
PLM on the Cloud – Tempest or Simply Vapor?

What I learned this week … is a reaction to some of the buzz coming out of SolidWorks World this year. I am not attending the event, but there has been a lot of good reporting from SolidSmack, Ray Kurland, Derrek Cooper, and others. The word of Day 1, it seems, was “Cloud.” So much so, in fact, that @rtara (Roopinder Tara) suggested on Twitter a new drinking game in which a shot is taken every time somebody says the word. So now “PLM” and “Cloud” are officially buzzwords. I have not spent much time on this, so I thought I would use this post as a starting point. Is this a brave new world, or just another buzzword to throw around?

The Buzz

Don’t get me wrong, buzz is not bad. As long as there is beef behind the buzz. OK, now that I have hopelessly mixed my metaphors, let’s do some definition.

  • Cloud (from Wikipedia, where else?) – Cloud computing is Internet-based (“cloud”-based) development and use of computer technology. In concept, it is a paradigm shift whereby details are abstracted from the users, who no longer have need of, expertise in, or control over the technology infrastructure “in the cloud” that supports them. Cloud computing describes a new supplement, consumption, and delivery model for IT services based on the Internet, and it typically involves the provision of dynamically scalable and often virtualized resources as a service over the Internet.
  • Cloud (greatly simplified by me) – Your data and your applications are out on the Internet.

It’s not as mysterious as it sounds: data and applications aren’t randomly dispersed, they are just outside of your organization with service provider(s). Any online application can be considered “Cloud Computing.” Amazon is in the cloud. What’s new is that we are talking about enterprise applications in the cloud. And even that isn’t so new; salesforce.com is CRM in the cloud, in a “software as a service” or “SaaS” mode. Seems like we could demystify this a bit, no?

So why is there so much buzz? Wouldn’t it be nice to move all of your IT problems outside to someone else? Why wouldn’t you want someone else to worry about capacity, bandwidth, security, OS upgrades, hardware upgrades, etc.? Oleg has also written a lot about the cloud in PLM Twine.

Buzz Kill

So why wouldn’t the cloud make sense? It lowers costs, simplifies infrastructure, and generally relieves a lot of IT burden from the enterprise. I will offer two perspectives:

  • Users Want Performance – For the most part, users don’t care where their data is. But they want to be able to get to their information when they need it, and they want it to be available rapidly. If data (and applications) move to the Internet, they will need to maintain an acceptable level of performance. For example, I have two e-mail accounts. One I control on my client; the other is “in the cloud.” When my PC decides to download a new virus definition, my Internet-based e-mail slows down. Not just downloading/uploading from my client (which is a bit of a pain), but actually typing an e-mail. The same is true as I am writing this blog post. If my bandwidth goes down or my laptop starts hogging resources for something else, I lose my focus on what I am writing and have to focus on getting the words on the page. If I get distracted by performance issues when trying to write a post or an e-mail, how distracting would it be to an engineer trying to solve a design problem?
  • Corporations Need Control – Of course we would like to move our IT problems to someone else. But who can we trust? Who will we work with if the performance our users demand isn’t there? Is it the application service provider’s issue, our Internet Service Provider’s (ISP) problem, or our network? How do we guarantee our precious product data is safe? Can we trust our service provider’s employees? Is there more inherent risk when my data and my competitors’ data are on the same infrastructure? Facebook made a mistake and gave access to profile data to the wrong people. Oops. What if that was your CAD data? These are just a few of the questions.
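To put a rough number on the performance point: a common usability rule of thumb is that feedback within about 100 ms still feels instantaneous. The sketch below (illustrative Python; the round-trip times are assumptions, not measurements) shows how quickly a chatty application burns through that budget once every request has to cross the Internet instead of the local network.

```python
# Rule-of-thumb window in which feedback still feels "instant" to a user.
PERCEPTION_BUDGET_MS = 100

def round_trips_within_budget(rtt_ms, budget_ms=PERCEPTION_BUDGET_MS):
    """How many sequential request/response cycles fit before the user notices lag."""
    return int(budget_ms // rtt_ms)

# Illustrative round-trip times (assumed, not measured)
lan_rtt_ms = 1    # on-premise server on the local network
wan_rtt_ms = 50   # cloud data center across the Internet

print(round_trips_within_budget(lan_rtt_ms))  # 100 chatty calls still feel instant
print(round_trips_within_budget(wan_rtt_ms))  # only 2 before the UI feels sluggish
```

The point is not the exact numbers, which vary widely, but the ratio: the same application design that feels snappy against a local server can feel sluggish the moment its calls traverse the Internet.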

None of the issues are insurmountable. We are moving in this direction. It has been an evolution, but more and more of what we do is happening off of our personal machines and outside of our firewall. I do think that trend will continue for cost and simplicity reasons. But the infrastructure and business models for the cloud are just developing for areas like PLM. Stay tuned.

Implications for Manufacturers

The implications from my perspective are clear:

  • The Cloud is Compelling
  • The Cloud Must Perform
  • The Cloud Must be Able to Answer some Serious Corporate IT Questions

Look for evolution when it comes to moving PLM to the cloud, not revolution.

So those are some early thoughts on PLM and the Cloud; I hope you found them interesting. This will be an evolution worth watching unfold. There will be bumps in the road (maybe some big ones), but the benefits are compelling. What do you think? Are you ready to give it a try? Have you?

SPEAK YOUR MIND

  1. Compelling, but deja vu. Those mainframes of yesteryear might as well have been in a cloud, they were so obtuse. Now some analysts are talking about in-house clouds. Must be getting payola from IT departments.

    Performance/IT Questions – Isn’t it interesting that when it is a service, people want service level agreements (SLAs)? Do ANY IT departments promise their internal customers ANYTHING for on-premise software?

    The other interesting thing about this with respect to PLM is to compare it with CRM like Salesforce.com. With PLM, people question whether companies will put their intellectual assets on the cloud, where they could be ripe for the picking. (If the recent articles are correct, it will all end up in China.) BUT, what is more valuable than a company’s sales pipeline and customer information, which people will readily cede to Salesforce.com?

    Just sayin’…

    Stan

  2. In his comment I think Stan hits on a couple of key points. The most important and one that I find Datastay discussing with prospects is the issue pertaining to putting IP on the cloud. The truth of the matter is that the security doesn’t come from having the box that holds your information sitting in the room next to you. It comes from having it securely vaulted and managed by the people and systems that were designed specifically to do so. Ensuring that there is one primary access point to your data helps control what goes in and what comes out. Sure your IP can still end up getting in the wrong hands, but it’s not because they will somehow gain access to your data through some magical entry point to your system. Truth is information is stolen by employees, and usually your own – so where they access it from really isn’t the root cause of the problem.

  3. I was at the presentation and was surprised at how much build-up SW gave this. Oracle, PTC, and Arena have had on-demand solutions out for a while now. My personal experience with on-demand PLM has been mixed. Like Stan, I think the security objection is suspect, but it does come up a lot. Ironically, most companies’ data would be safer on the cloud than it is at their own facilities. You really hit on the big issue, performance. Not sure if SW has some magic dust or something that will make the Internet faster, but based on my experience with PLM on demand, it has a ways to go before it comes close to on-premise, much less surpassing it.

  4. Stan,
    Good point about the mainframe being obtuse. My point about the cloud is that the data is outside of the business. But “cloud” makes it sound like it is randomly spread across any available server. The truth is that what is new about “the cloud” is virtualizing servers and getting the most out of infrastructure. What we are really talking about here is not the technical deployment of servers, but how comfortable companies are outsourcing their apps and data.

    Michael points out something I have heard (and believe) but can’t verify. Most data loss comes from the inside. Physical access to machines is a big key to security. So if a company is going to trust another company to protect their information, they must have visibility into security procedures, employee screening, etc. And from my experience, an outsourcing provider is often more careful (and more capable) in regards to security than most in-house IT groups.

    Thanks,
    Jim

  5. Steve,
    As far as performance is concerned, there are controllable and uncontrollable factors. When things are in-house, they are (in theory) more controlled. I am curious to see how big a role bandwidth (and latency) issues play as more applications move to the web. One of the reasons we all loved our first PCs so much was that they were lightning fast compared to the mainframes. We need to replicate lightning fast on “the cloud” – no pun intended. Even more interesting is that SolidWorks (and Dassault Systemes in prior conversations) is talking about moving design applications / CAD to the “cloud.” I am as curious about how well they can rapidly track user input as I am about rapid rendering to provide the instantaneous feedback designers want. They want pencil-and-paper fast. Not send-and-receive fast.

    I am not saying this won’t be done. But I would be cautious about this and do some serious testing if it was my business.

    Thanks for the comment,
    Jim

  6. Funny side note – my browser crashed while I was writing the post, and I lost data. That kind of thing just happens. How tolerant will companies be? How tolerant will their engineers be?

  7. Jim.. good stuff here. I am surprised at the reaction to the “cloud.” People seem to be consumed with the need to have a clear definition of it. Do people not really get it, or do they prefer just beating a dead horse? Clearly the “cloud” refers to web-based access to software/data. Some of it could be outsourced, could be “in-house” servers, or a combination of all of the above. Performance and security are absolutely two things that will just have to be figured out. As for PTC, Arena, etc. having these capabilities – sure, they have some of the data capabilities, but what SW was showing is way beyond what most companies have ponied up to show. That is, they showed a possible future of MCAD.

    Personally, I can’t wait for the “cloud” to become a reality. I use salesforce, google docs, dropbox, jing, screencast.com, ubuntu one everyday and can’t wait for other aspects of software to join the ranks.

    The most exciting thing for me is the simulation community. We have as much to benefit from the cloud as anyone. But the challenges are huge. Not only do we need access to software on demand, our computing requirements are way beyond the average user’s. So we can’t simply hook up with an Amazon or Google service, as the hardware requirements for us are much greater. BUT – it will be a reality in the near future, guarantee it.

    • Derrek,
      Thanks. It’s funny to think of the simulation community in regards to the cloud. Haven’t you guys been pushing work off to servers wherever you can find capacity for years? Some of the biggest / baddest machines (and clusters of machines) I have seen are for simulation and analysis work. If you can couple that capacity/capability with real-time interaction over the Internet, you could really change the pace at which designers get feedback on their work.

      Thanks for your thoughts,
      Jim
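A footnote on the “pencil-and-paper fast” point from the thread above: interactive 3D work is usually judged against a per-frame time budget, and the arithmetic is unforgiving once a network round trip sits inside the input-to-feedback loop. A minimal sketch (the latencies are assumed, illustrative values, not measurements):

```python
# Interactive design tools are typically judged against ~60 frames per second,
# which leaves roughly 16.7 ms of total work per frame.
FRAME_BUDGET_MS = 1000 / 60

def fits_in_frame(rtt_ms, budget_ms=FRAME_BUDGET_MS):
    """True if one server round trip can complete within a single frame."""
    return rtt_ms <= budget_ms

# Illustrative round-trip times (assumptions, not measurements)
print(fits_in_frame(1))    # local server: True, feedback can stay in the frame loop
print(fits_in_frame(50))   # cross-Internet: False, feedback must be decoupled or hidden
```

This is why cloud CAD vendors would likely have to keep input tracking and rendering responsive locally (or mask latency some other way) rather than round-tripping every stroke to the server.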
