Friday, May 13, 2011

Will Analytics Drag O.R. Back to Its Roots?

The INFORMS blog challenge for May is "O.R. and Analytics", so here goes ...

There are about as many definitions of "analytics" as there are explanations for the recent global economic crash. My take on "analytics" has been that, at least until now, what it meant to business executives was "take this huge pile of data and make sense of it". Operations research models in the real world have always relied on some level of data analysis, because models contain parameters and parameters must be estimated. (Outside the real world -- which is to say in academe and government service -- modelers get to make up whatever parameter values they like.) That said, I've never really associated heavy-duty data analysis with O.R. Statisticians analyze data and data miners try to read sheep entrails (according to the statisticians); O.R. analysts build and use models.

As the term "analytics" grabs mind share among executives, though, O.R. societies and O.R. practitioners are trying, to borrow a phrase from Microsoft, to "embrace and extend" it. The societies see this as a way to boost membership and conference attendance, and both the societies and practitioners see it as a way to enhance the visibility of O.R. to the people who will (hopefully) employ O.R. analysts, directly or as consultants. I would not be surprised if the data analysis crowd see this as O.R. people trying to share the spotlight uninvited. Fortunately, since their forte is recognizing patterns as opposed to prescribing solutions, they'll see us coming but probably won't be able to keep us out.

Extending the O.R. brand will require more than just saying "operations research ... including data analysis" or "operations research is analytics ... plus so much more". If we're serious about this, we'll need to reconsider what we mean by "operations research", and perhaps how we go about its practice. Therein lies an opportunity to return to O.R.'s roots.

The history of O.R. traces back to World War II, and the stories from that era have a common thread. Someone has a problem. The problem appears somewhat quantitative in nature (or at least amenable to measurement and quantification). We want to model it, and see what the model suggests might solve the problem. Absent from this description is any pigeon-holing of the problem as a linear program, or a queueing problem, or a discrete event simulation. One of the classic examples was finding ways to move men and materiel from the U.S. to the U.K. more safely during the Battle of the Atlantic. The answer involved determining the optimal shape of a convoy (which was not a mathematical programming problem, the word "optimal" notwithstanding), recognizing that small convoys of fast ships (small possibly meaning a single ship) might not need an escort (they were too fast for the U-boats to pick on), and so forth.

As O.R. bloomed after the war, we developed more and better tools (theory, algorithms, software) ... and along the way, we became more specialized. So now we have experts in optimization and experts in simulation and so on, and we tend to adhere to the maxim (which I will shamelessly bastardize) that if you're really good with a screwdriver, everything looks like a screw. At a recent INFORMS conference, I attended a session about teaching modeling to MBAs. Given the topic, I suspect most of the attendees were fellow academics, so I apologize if I offend any practitioners by tarring them with the same brush. At one point in the session, the presenters posed a scenario to us (the Red Cross is thinking about paying for blood donations, and wants some consulting on it), and, with not much more detail than what I just gave, turned us loose on it. The optimizers started writing optimization models. The simulators started sketching out simulation models. If there were any decision analysts in the room, I'm sure they were drawing decision trees. In other words, most of us framed the problem in terms of our favorite tools, rather than fleshing out the problem (which was quite vague) and then looking for tools (possibly mathematically unsophisticated tools) that would match it.

As we try to figure out how data mining and "business intelligence" (note that I'm skipping all oxymoron jokes here, a severe exercise in restraint) fit with O.R., perhaps we can seize the opportunity to start conceptualizing (and teaching) O.R. as first and foremost sorting out and describing what the problem actually is. My very limited understanding of data mining suggests that it leans more toward letting the data tell you what it wants to tell you than toward making the data fit a preconceived model; extend that to the overall problem, rather than just the data, and we're back to something I think the progenitors of O.R. would recognize.

9 comments:

  1. ".. most of us framed the problem in terms of our favorite tools, rather than fleshing out the problem .. "
    - precisely what i felt at a recent INFORMS conference too.

  2. Paul, excellent post! I could not agree more, with the last 20 lines in particular.

  3. A case-based approach might fit such an expanded definition of OR (i.e., one that included Analytics) better than the current technique-based curricula. Further, if we wish to become relevant to the world of business, the focus of OR education should move to the Master's level ("Master of Business Analytics", anyone?), as is the case with management education.

    PhD-focused OR education is currently an undisguised faculty replication scheme. It certainly isn't about creating a corps of analytic experts. (It strikes me that what used to be the EES department at Stanford had a maverick real-world orientation. Alas, it is no more.)

  4. @Sanjay: I agree in part, but I think there's a need for a balance between masters and doctoral programs. Masters programs should emphasize practice; doctoral programs need to emphasize research. The tricky part is that faculty need to know practice as well as theory, since they'll do the bulk of the teaching of the masters students.

  5. @Sanjay, a Masters of Business Analytics degree sounds like a good, but very interdisciplinary idea to me. It would need content in Statistics, CS (databases, programming, machine learning), OR, and Business per se. Almost sounds more like a program of a Business school than an Engineering school. Too bad the MBA acronym is taken! MBQA? Masters of Business Quantitative Analytics? There certainly are MBAs with relatively strong quantitative backgrounds, but I think they tend to manage analytic projects, not implement them, no?

  6. MSBA (Master of Science in Business Analytics)?

  7. Thanks for this great post, Paul. I am still trying to make sense of the rise of Analytics. While I like the recent moves made by INFORMS to try to bring the Analytics folks under the OR umbrella, it will require "extending the O.R. brand." That is not so trivial, and as a non-marketing person, I don't have the answers. Your post has initiated an important conversation, one that I hope we continue to have in the O.R. discipline.

  8. Thanks for the kind words, Laura. I'm one of the people least attuned to marketing on the planet, so the brand extension is largely a mystery to me. I just hope this doesn't end up with data mining and forecasting just being a couple more tools in the OR tool box.

  9. I enjoyed how your post summarizes the doubts about the analytics buzzword.

    Indeed, it is important that academia warn against business misconceptions, just as it is important that researchers in specific areas understand that they will not be able to solve all the problems they face on their own.
