Putting the sexy back into BPM since 2008.
This is a blog devoted to Business Process Management Virtual Environments, a new research theme within the Business Process Management Research Cluster at the Queensland University of Technology.
BPMVE is an investigation into using the latest interactive entertainment technologies for visualising and interacting with Business Process Models for the benefit of all stakeholders.
We have recently published a new journal paper in the Information Systems Journal on our virtual worlds elicitation work in BPM - "Augmenting process elicitation with visual priming: An empirical exploration of user behaviour and modelling outcomes."
The paper is available on QUT ePrints, along with the journal DOI. The work was performed by my PhD student Joel Harman along with my collaborators at QUT, Metasonic and the University of Vienna.
Well done Joel!
Business process models have become an effective way of examining business practices to identify areas for improvement. While common information gathering approaches are generally efficacious, they can be quite time consuming and have the risk of developing inaccuracies when information is forgotten or incorrectly interpreted by analysts. In this study, the potential of a role-playing approach to process elicitation and specification has been examined. This method allows stakeholders to enter a virtual world and role-play actions similarly to how they would in reality. As actions are completed, a model is automatically developed, removing the need for stakeholders to learn and understand a modelling grammar. An empirical investigation comparing both the modelling outputs and participant behaviour of this virtual world role-play elicitor with an S-BPM process modelling tool found that while the modelling approaches of the two groups varied greatly, the virtual world elicitor may not only improve both the number of individual process task steps remembered and the correctness of task ordering, but also provide a reduction in the time required for stakeholders to model a process view.
In conjunction with the 14th International Conference on Business Process Management, BPM2016
Call for Papers
Visualizations can make the structure and dependencies between elements in processes accessible in order to support users who need to analyze process models and their instances.
However, effectively visualizing processes in a user-friendly way is often a big challenge, especially for complex process models which can consist of hundreds of process components (e.g., process activities, data flows, and resources) and thousands of running process instances in different execution states.
Many challenges remain to be addressed within the broad area of process visualization, human interaction and user-led design, such as: scalability, human-computer interaction, cognitive aspects, applicability of different approaches, collaboration, process evolution, run-time requirements of process instances and applications, and user engagement.
Topics of interest include (but are not limited to):
Visual Metaphors in Processes
Visual Design and Aesthetics for Processes
Visualization of Dynamic Data in Processes
Change Visualization for Processes
Interface and Interaction Techniques for Process Visualization
Visualization Techniques for Collaboration and Distributed Processes
Visualization of Large-scale Processes
Cognition and Perception in Process Visualization
Evaluation and User Studies of Process Visualization
Evaluation Methods for Human Aspects in PAIS
Visual Modeling Languages
Analysis Techniques and Visualization for Processes
Process Visualization on Large Screens
Mobile Process Visualization
Visualization Tools and Systems for Processes
Visualization Techniques for Processes
Process Visualization and Sonification
Virtual World Process Visualization
Immersive Process Modeling Approaches
Human Computer Interaction Design Applied to Process Systems
3D Process Visualization Approaches
Human-centric aspects in business process management
User-centered design for BPM
User Interface design for Processes
Format of the Workshop
The half day workshop will comprise accepted papers and tool evaluations. Papers should be submitted in advance and will be reviewed by at least three members of the program committee.
This year's programme will also include a new innovation. Part of the workshop time (depending on the number of prototype submissions) will be set aside for focus group assessments of tools. We will be requesting that tool report authors, successful workshop paper authors and panel members attending BPM assist in the assessment of the demonstrated visualization techniques and software. This evaluation process will be a service to attendees, as these heuristic assessments can be written up later as separate papers, or by the workshop chairs as an aggregated workshop outcome. Such evaluations will be an exciting addition to the workshop, as people experienced in Information Visualization, BPM, HCI and related fields will provide detailed feedback on your prototypes. The evaluation approach is largely in the hands of the tool report authors, but at a minimum it should involve direct interaction with your software and some form of validation via a questionnaire.
All accepted papers will appear in the workshop proceedings published by Springer in the Lecture Notes in Business Information Processing (LNBIP) series. There will be a single LNBIP volume dedicated to the proceedings of all BPM workshops. As this volume will appear after the conference, there will be informal proceedings during the workshop. At least one author for each accepted paper should register for the workshop and present the paper.
Deadline for workshop paper submissions: 27 May 2016
Notification of Acceptance: 27 June 2016
Camera‐ready version: 18 July 2016
TAProViz Workshop: 19 September 2016
Prospective authors are invited to submit papers for presentation in any of the areas listed above.
Three types of submissions are possible:
(1) full papers (12 pages long) reporting mature research results
(2) position papers reporting research that may be at a preliminary stage and has not yet been evaluated
(3) tool reports, to be evaluated at the workshop
Position papers and tool reports should be no longer than 6 pages. Tool reports should include a brief evaluation plan as an appendix, for the evaluation session on the day of the workshop.
Only papers in English will be accepted; they must present original research contributions not concurrently submitted elsewhere. Papers should be submitted in the LNBIP format (http://www.springer.com/computer/lncs?SGWID=0-164-6-791344-0). The title page must contain a short abstract, a classification of the topics covered, preferably using the list of topics above, and an indication of the submission category (regular paper/position paper/tool report).
All accepted workshop papers will be published by Springer as a post-workshop proceedings volume in the series Lecture Notes in Business Information Processing (LNBIP). Hard copies of these proceedings will be shipped to all registered participants approximately four months after the workshops, while preliminary proceedings will be distributed during the workshop.
Submitted papers will be evaluated, in a double-blind manner, on the basis of significance, originality, technical quality, and exposition. Papers should clearly establish their research contribution and its relation to the theory and application of process visualization.
Thematic analysis is a poorly demarcated, rarely acknowledged, yet widely used qualitative analytic method within psychology. In this paper, we argue that it offers an accessible and theoretically flexible approach to analysing qualitative data. We outline what thematic analysis is, locating it in relation to other qualitative analytic methods that search for themes or patterns, and in relation to different epistemological and ontological positions. We then provide clear guidelines to those wanting to start thematic analysis, or conduct it in a more deliberate and rigorous way, and consider potential pitfalls in conducting thematic analysis. Finally, we outline the disadvantages and advantages of thematic analysis. We conclude by advocating thematic analysis as a useful and flexible method for qualitative research in and beyond psychology.
A useful paper that plugs the gap left by the absence of an introductory treatment of thematic analysis for non-qualitative researchers. While I do have problems with this technique in contrast to the quantitative methods I have usually used in software engineering, computer graphics and related fields, it does enable contextualisation of the thoughts of participants in user studies.
Reading starts here:
There seems to be a lack of consensus on the placement of thematic analysis: whether or not it is a specific method in its own right. The authors argue that it is, and aim to define it clearly in this paper without restricting its flexibility.
Corpus refers to all the data collected, while data set refers to the subset of the corpus used in the analysis.
Key Point - "Thematic analysis is a method for identifying, analysing and reporting patterns (themes) within data." Obvious, but it has to be stated.
They claim that it is a method of analysis, and that other named approaches are essentially still a form of thematic analysis.
They comment that themes should not "emerge"; that is a passive framing which denies the place of the researcher. They say that themes reside in our heads, arising from our thinking about the data. ME: I wonder if a theme is a product of an interaction with the data, which probably puts me in a social constructionist position. Does the data turn into an entity in my phenomenological experience? No, that is too silly. But one could imagine that I am giving the data some agency in some way. Something to think about...
"What counts as a theme? A theme captures something important about the data in relation to the research question, and represents some level of patterned response or meaning within the data set."
They suggest there is no hard percentage of the data that establishes a theme. ME: can this get any more vague? This suggests there is no standard computational component to thematic analysis, though they do suggest prevalence as a heuristic. I guess even a single instance is evidence against that phenomenon being absent from the rest of the cohort.
"Part of the flexibility of thematic analysis is that it allows you to determine themes (and prevalence) in a number of ways. What is important is that you are consistent in how you do this within any particular analysis."
ME: I am struggling here. How do you take a measurement without a clear comparable metric that generalises to other cohorts? While I get the primacy of consistency, one can be consistently wrong as well.
They then compare and contrast top-down theoretical approaches with bottom-up inductive processes. Inductive analysis is considered richer overall, while top-down gives a more detailed analysis of a particular theme.
Semantic or latent themes: semantic analysis examines the themes present in the data prima facie, while latent analysis looks for the underlying ideas behind the themes, and is thus more interpretive. They also note that latent analysis is more constructionist, but not necessarily completely so. ME: man, my poor quantitative brain is struggling a little here. :-) While I get that phenomenological outcomes are going to be inexact and dynamic, I am finding this a little hard going.
"Those approaches which consider specific aspects, latent themes and are constructionist tend to often cluster together, while those that consider meanings across the whole data set, semantic themes, and are realist, often cluster together."
There are six phases (guidelines, not rules, of course! :-)), with some comments or extracts from their descriptions of these phases below.
1. Familiarizing yourself with your data:
2. Generating initial codes:
3. Searching for themes:
4. Reviewing themes:
5. Defining and naming themes:
6. Producing the report:
They recommend re-reading the data and marking down codes now, as part of an interpretive process. Make sure not to clean up transcriptions, as punctuation can matter at this stage. Interestingly, they suggest transcribing the data yourself (my IS school hires people for this) in order to develop a deep reading of it. Good point.
"remember that you can code individual extracts of data in as many different 'themes' as they fit into, so an extract may be uncoded, coded once, or coded many times, as relevant." ME: This seems to be a many-to-many mapping between extracts and themes. I would need an example, but if many themes reuse extracts that reappear, then that must be problematic for identifying themes as distinct entities. It just seems too incoherent.
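The many-to-many mapping I am puzzling over can at least be stated precisely. Here is a minimal Python sketch (the extracts, theme names and coding are all hypothetical, not from the paper) of extracts coded into zero or more themes, with prevalence counted as the simple heuristic the authors mention:

```python
# A minimal sketch, with hypothetical interview extracts and theme names,
# of the many-to-many coding described above: an extract may be uncoded,
# coded once, or coded under many themes.
from collections import defaultdict

# Hypothetical extracts from a transcript (the "data set").
extracts = [
    "The tool was confusing at first, but I got used to it.",
    "I liked seeing my process appear as I played.",
    "The menus were hard to find.",
    "We ran the study on a Wednesday.",  # left uncoded
]

# Analyst-assigned coding: extract index -> set of themes (many-to-many).
coding = {
    0: {"learnability", "usability"},  # coded under two themes
    1: {"engagement"},                 # coded once
    2: {"usability"},
    # extract 3 is uncoded, so it simply has no entry
}

# Prevalence as the heuristic the authors suggest: count how many
# extracts carry each theme.
prevalence = defaultdict(int)
for themes in coding.values():
    for theme in themes:
        prevalence[theme] += 1
```

Counting prevalence this way makes the vagueness concrete: "usability" appears in two of four extracts, but nothing in the method itself says whether that is enough to count as a theme.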
Key point: do not throw out any themes at this stage; keep them in the basket for later.
Make sure your themes fit all the data, which to me seems almost impossible, but it may work as an inexact process.
Stop refining when there is no new contribution to the themes. ME: Almost an embedded grounded theory exercise.
The key take-home point is that the themes need to be an essential description of the data, with punchy and clear titles for communicating the ideas.
The report should provide sufficient evidence of the presence of the themes within the analysed data set.
Needs to go beyond description to make an argument about the data analysed. ME: Key point here.
Lack of actual analysis - obvious, but I have seen it a lot in tool evaluations. Maybe because the response to the tool is not expected to be that complex?
"The third is a weak or unconvincing analysis, where the themes do not appear to work, where there is too much overlap between themes, or where the themes are not internally coherent and consistent."
"'anecdotalism' in qualitative research, where one or a few instances of a phenomenon are reified into a pattern or theme, when it or they are actually idiosyncratic."
Consider alternative thematic explanations in the analysis, to show deep insight into the data and its context.
"One of the criticisms of qualitative research from those outside the field is the perception that ‘anything goes’." ME: The authors have insight! :-)
As can be seen from my comments, I have some skepticism about the results of such thematic analyses. However, I do grant the technique validity for its ability to contextualise quantitative results, and I aim to apply it with more rigour in my future research, as my work usually involves software tool analysis.
The workshop paper and related ITS poster have been uploaded to ePrints. They will be presented on 15 November at ITS 2015 and the associated CMIS workshop. This work is part of a collaboration with the University of Bochum, supported by IFE @ QUT.
Last Wednesday night we had our BGIE Industry Showcase in P Block @ QUT. Great night, with heaps of gameplay. But, even if you missed the event, you can play the games! All the links to the student projects follow. So knock yourself out and play my students' games. You won't be disappointed. :-)