Chris Sideroff is the owner and principal director at Applied CCM Canada (www.appliedccm.ca). Applied CCM Canada is the Canadian distributor for Pointwise, offers OpenFOAM-based development and support services, and is the primary developer of the OpenFOAM derivative, Caelus.
What do you see are the biggest challenges facing CFD in the next 3 years?
I think an obvious one is meshing – which means Pointwise is safe for a while. Seriously though, I think a big issue with meshing is a disconnect between what the mesher (software and/or user) creates and what the solver wants. I believe it’s deeper than what mesh adaption can do or has been promising to do (for quite some time). Every numerical implementation of a CFD solver makes assumptions – this is how we discretize the Navier-Stokes equations so they can be solved numerically. One or more of these assumptions always has a geometric consideration; hence the dependency on some characteristic of the mesh elements’ shapes.
As I see it there are two main issues. First, it tends to be very difficult, in a general sense, to identify exactly how much a given mesh metric affects solution accuracy. The relationships among the assumptions are non-linear, so rarely is there an opportunity to isolate and study the influence of one metric. Second, what, where, and how these assumptions are made differs between CFD solvers. It’s like not knowing which target you’re aiming at – how do you expect to hit the bull’s-eye? In my opinion, it’s going to require much closer collaboration between those who develop meshing software and those who develop solver software. If we can reduce or eliminate the influence of element shapes on solution accuracy, then there is some hope of significantly reducing the mesh generation burden. Then we can begin to think about things like automation and adaption.
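As a rough illustration of what “geometric consideration” means here, below is a minimal sketch (my own illustrative example, not taken from any particular solver) of one quality measure finite-volume codes are sensitive to: face non-orthogonality, the angle between a face’s normal and the line connecting the two adjacent cell centroids.

```python
# Illustrative only: one geometric quality metric a finite-volume solver
# cares about. Large non-orthogonality degrades the accuracy of the
# face-gradient approximation used in the discretized equations.
import numpy as np

def non_orthogonality_deg(owner_centroid, neighbour_centroid, face_normal):
    """Angle (degrees) between the centroid-to-centroid vector and the
    face normal; 0 is a perfectly orthogonal face."""
    d = np.asarray(neighbour_centroid, dtype=float) - np.asarray(owner_centroid, dtype=float)
    n = np.asarray(face_normal, dtype=float)
    cos_theta = np.dot(d, n) / (np.linalg.norm(d) * np.linalg.norm(n))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Two unit cells whose shared face normal is tilted 30 degrees off the
# centroid-to-centroid line:
print(non_orthogonality_deg([0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                            [np.cos(np.radians(30)), np.sin(np.radians(30)), 0.0]))
# -> 30.0
```

The catch, as noted above, is that each solver weights metrics like this differently, so a single number rarely tells you how much accuracy you are giving up.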
The other big challenge is exploiting parallel architectures. And let me be clear here and state that I don’t believe “the cloud” is our savior. There is more than enough hardware at our disposal even in a relatively low-cost desk-side workstation, but we’re simply not using it. Traditionally we think of running CFD on the CPU core – one CFD process per core. But within a modern workstation there are hundreds and sometimes thousands of unused processing units. I’m using the term unit here very generically. An obvious one you often hear about is general purpose computing on graphics cards, also known as GPGPU computing. There are a few others like multi-threaded cores, co-processors, and ISA vector extensions. As an example, I was reading an article on HPCWire that mentioned the upcoming Power9 architecture from IBM would need 2,000 parallel operations at a time to keep a single core busy.
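As a back-of-the-envelope illustration of that unused capacity (a toy example of my own, not a CFD code), the sketch below contrasts a serial per-cell loop with the same work spread across every core Python can see on the machine:

```python
# Toy illustration: a serial "per-cell" loop keeps one core busy while the
# rest of the workstation sits idle; spreading the same work across all
# cores is trivial for embarrassingly parallel updates. Workload is made up.
import math
import multiprocessing as mp

def cell_update(i):
    # Stand-in for the arithmetic done per cell in an explicit update.
    return math.sin(i) * math.cos(i)

if __name__ == "__main__":
    n_cells = 200_000
    print("Logical cores available:", mp.cpu_count())

    # Serial version: one core busy, the rest idle.
    serial = sum(cell_update(i) for i in range(n_cells))

    # Parallel version: the same work distributed across every core.
    with mp.Pool() as pool:
        parallel = sum(pool.map(cell_update, range(n_cells), chunksize=5_000))

    print(serial, parallel)
```

Real solvers obviously need domain decomposition, communication, and vectorization on top of this, but the point stands: the hardware is already on the desk.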
On the flip side, if you look at the www.Top500.org list of the fastest supercomputers in the world, you will find that computers with more than 100,000 cores are commonplace and there are several with over a million. Combine that with what a single core can do and we’ll soon be talking about billion-process simulations. Then the trillion-cell challenge that Tecplot has posed will not seem so outrageous. In the CFD world, we have only scratched the surface of understanding how to exploit all these resources. In this instance I feel the challenge is educating CFD engineers more about computer science so we can better utilize the resources already at our disposal.
John: As you know, I think NASA’s CFD Vision 2030 Study is a very comprehensive assessment of where CFD is today versus where we need it to be in the year 2030. Meshing and exascale computing were only two of the things they cited. How do you feel about the report overall? Did they get it right? Did they miss anything?
Chris: In the context of CFD for aerospace I think they got it right. I say in the context of aerospace because I’m sure CFD users in this field, particularly in the U.S. where they have access to many of the excellent aero-centric CFD codes, may disagree with my point about the disconnect between mesh quality and solution accuracy. My argument was more general – about CFD usage across the many industries that use commercial or openly available solvers, where mesh quality can be a barrier to robustness and reliability.
In regard to my parallel computing comment, I think my point about the parallel resources available on the desktop is not addressed. One of their grand challenges was a large eddy simulation of a full-scale aircraft across the flight envelope. Clearly, that’s only going to be possible on the largest supercomputers available at the time, even by 2030. The overwhelming majority of CFD usage is still going to occur on the desktop, so it seems to me even small improvements in that bigger portion of the “usage pie” would lead to more profound improvements in the applicability of CFD.
My general feeling is it was an excellent report that demonstrates leadership both from NASA as a leader in aerodynamics and for aerodynamics as a leader for CFD. While maybe it’s not applicable to all areas of CFD, it should prove to be a good reference point for other industries.
What are you currently working on?
We are working madly to complete our next release of Caelus, 5.10 (www.caelus-cml.com). Caelus is our fork of OpenFOAM. The first release was about bringing open-source CFD to Mac and Windows. The second release was about improving numerical aspects, in particular improving accuracy near walls and reducing sensitivity to mesh quality. We also added new infrastructure based on Python that will make it much easier to use and automate Caelus. The upcoming release, slated for this October, will among other things reintroduce compressible solvers and turbulence models, the majority of which were written from scratch. OpenFOAM is not known for its compressible capabilities, so we took some time to develop new solvers that should improve that reputation.
![CAELUS-logo]()
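On the Python automation point, the sketch below shows the general flavor of scripting a batch of solver runs rather than driving the shell by hand. It is only an illustration under assumptions – the solver name, case directories, and log convention are made up and are not the actual Caelus Python tooling.

```python
# Hypothetical sketch of Python-driven case automation. The executable
# name "simpleSolver" and the case directory names are placeholders.
import subprocess
from pathlib import Path

def run_case(case_dir, solver="simpleSolver"):
    """Run a solver executable in a case directory, capture its output
    in a log file, and return the exit code."""
    case = Path(case_dir)
    log_file = case / f"log.{solver}"
    with log_file.open("w") as log:
        result = subprocess.run([solver], cwd=case, stdout=log,
                                stderr=subprocess.STDOUT)
    return result.returncode

if __name__ == "__main__":
    # Sweep a family of cases in one go.
    for case in ("cavity_coarse", "cavity_medium", "cavity_fine"):
        print(case, "exited with", run_case(case))
```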
Aside from that I am also quite busy working with our Pointwise customers. When I get questions from customers, I like diving into their meshing projects, particularly if scripting is involved.
I’ve also been spending time trying to raise awareness of Applied CCM and Pointwise throughout Canada. One avenue in particular is Pointwise’s #TutorialTuesday on Twitter, where Pointwise tips, tricks, and tutorials are posted every Tuesday. Through this, I’ve created a couple of videos demonstrating how to mesh a submarine hull geometry with both structured and unstructured techniques. I plan on using these meshes in some Caelus tutorials as well.
![sub-hull-struc-tut]()
Structured grid for a submarine hull
I am continually working on several projects involving Caelus and OpenFOAM. Applied CCM conducts a significant amount of internal research and development on new solvers, tools, and methods. Our adjoint solver for shape optimization, our tangent solver for parameter optimization, and our high-speed transient solver, which is significantly faster and more scalable than traditional transient solvers, are all examples of technologies that have come out of our internal development projects.
John: OK, I have to ask. Why does the world need another variant of OpenFOAM? Where do you think Caelus fits into the OpenFOAM galaxy and the overall CFD universe? More directly, why would I want to use Caelus for my CFD versus something else?
Chris: Would it come as a shock to you that we’ve been asked this question before? I usually reply with our technical reasons for doing so, but perhaps I’ll take this opportunity to give some more philosophical ones.
There is a general lack of cohesiveness in the development and use of OpenFOAM. To explain why I say this, let me start with some context. For an open-source project to be successful, both in terms of technical ability and social health, the process for making decisions must be transparent. It does not necessarily need to be democratic – simply transparent. You often hear the term “benevolent dictator” applied to the leaders of many of the larger projects. Linus Torvalds of Linux and Guido van Rossum of Python are two good examples. There needs to exist a dynamic two-way dialog between the maintainer and the community. The maintainer makes decisions unilaterally, but like any successful leader they do so while at least heeding their users’ advice and feedback. As I see it, this does not occur with OpenFOAM, or if it does it is not obvious.
The community around OpenFOAM is vibrant and continually growing, but there is little communication between the maintainers of OpenFOAM and the community. So there are all these people doing what they love around OpenFOAM, but because there’s no dialogue and no legitimate way of contributing ideas back, there is a lot of fragmentation. Furthermore, because OpenFOAM provides no proof of verification, validation, or algorithmic code review, the community is left to do so on their own, leading to further fragmentation. In my opinion, this lack of cohesiveness is significantly hampering the project. As you mention, there are other OpenFOAM projects, but I have yet to see the kind of cohesiveness I was speaking of, the kind you see in other successful open-source projects. At least that’s how I see it.
We hope to change that model with Caelus. We try to have an open, two-way dialogue with users. For users who have developed their own extensions to Caelus, provided they’ve been satisfactorily tested and documented, we will happily add them to the distribution. We also do not require reassignment of copyright of the contributed code. Verification, validation, and code review are part of our Caelus development process now – and we are open about it. The plan is to make the test cases serve as the tutorials, so that when a user looks to a tutorial as a guideline, those settings are relevant rather than only applicable to trivial cases. We have started this already, but it will take some time to develop a significant body of test cases and documentation. As the community around Caelus grows, we may look to it to help maintain certain portions, but ultimately we will remain the benevolent dictators.
So that was the ideological thought process. Why would people want to use Caelus? First, it is built, tested, and supported on Linux, Mac, and Windows. It’s no secret the most prevalent desktop OS is Windows, so those people were getting no love from OpenFOAM. Yes, there are a number of projects that have enabled OpenFOAM on Windows, but many require the user to figure out how to compile it and, again, are not officially supported by OpenFOAM. The development environment on Windows is far different from that on Unix-based operating systems, so what compiles on Linux can be very difficult to compile on Windows. And most users probably don’t even want to bother compiling the software – as users they’re perfectly happy with binaries. That was our assumption when making this decision, and it has been validated since the initial release of Caelus: Windows downloads outnumber those for the other two platforms two to one.
OpenFOAM is known to be sensitive to mesh quality. There are schemes and settings available to counteract this – which often don’t help – but because the documentation and tutorials are so limited, users are often immediately turned off. So unless you have an expert at your disposal, getting even converged solutions can be an exercise in futility. The second release of Caelus was a first step in remedying this. I’m not going to profess that all the issues can be fixed in one release and will work on every case, but it’s a step in the right direction. There are a few other items that contribute to mesh sensitivity, and those will take us a little longer to develop and test. I would also like to mention that we release Caelus twice a year – April and October – to keep things moving forward at a predictable pace. The version numbers reflect that: the first was 4.10 for October 2014, the second was 5.04 for last April, then 5.10 for this October, and so on.
John: Thanks for your candor and transparency, to use your word. I’m hoping this incites a dialog in the comments on this article.
How did you get to be where you are today?
I grew up in a really small town in northwestern Alberta, Canada (find Edmonton on a map and go northwest until you stop seeing civilization). Particularly since I grew up before the internet got going, I wasn’t exposed to much outside of hockey. I do recall always having a love for airplanes and pretty much anything that flew. Plus, I was naturally good at math and science.
My path to getting into CFD wasn’t exactly direct. Out of high school, I started my freshman year of engineering at a local college that had a transfer program to the University of Alberta for subsequent years. That year didn’t go so well and I was asked to rethink my future as an engineer. The petroleum industry in Alberta at the time was booming, so I got a job as a truck driver on a frack crew (yes, the so-called “evil” fracking you hear of in the news regularly). To make a long story shorter, after a couple of years as a truck driver, I had a rare moment of clarity where I decided that wasn’t the life for me anymore. I quit within a week. I also want to mention that being a truck driver is not a bad thing – I actually liked doing it, learned a ton, and worked with a lot of great people – but it just didn’t jibe with the expectations I had for myself.
I managed to get back into engineering school and eventually enrolled in the Mechanical Engineering program at the University of Alberta in Edmonton, graduating in 2000 with honors. During my senior year, as I was thinking about joining the work force, I found myself a little disillusioned with the types of jobs available in Alberta. Nearly all my classmates were moving on to very prosperous careers in the petroleum industry, but the aero-itch was still there for me. As fate would have it, during the final semester of my degree, the instructor in my compressible flow course, Dr. Jeff Yokota, suggested I consider grad school. Jeff was a totally unconventional professor – think long hair, many earrings, plaid unbuttoned shirt over a plain T-shirt, jeans and sneakers [John: Sounds like every math professor I had.] – but for whatever reason he resonated with me. Before becoming a professor, Jeff was a CFD researcher at NASA Glenn (then called Lewis), so he had lots of cool stories, knowledge, and experiences to share. Grad school with Jeff working on CFD was an obvious choice. I finished my MSc in 2003; it was about developing inverse design methods for axial turbomachinery cascades using CFD.
That work led me to my PhD at Syracuse. While that’s what led me there, I ended up working on something completely different (applying CFD to the indoor environment). It still involved CFD, of course, but with commercial software instead of in-house developed codes. In retrospect, the opportunity to try many different approaches and tools was the best thing for me. Although there was no way I knew it at the time, probably the most influential factor in how I got to where I am today was Gridgen. That was how you and I were introduced.
![Simulation of the personal micro-environment.]()
Simulation of the personal micro-environment.
John: Because I’m an Orangeman (Syracuse class of 1984), Pointwise has been providing my alma mater with free software for many years. And when I was on campus for an engineering career fair, that’s when you and I met. One thing led to another and you began work at Pointwise.
Who or what inspired you to get started in your career?
I hope no one I know takes offense at this, but probably no one but myself. There are many who have helped me get where I am, but I honestly feel my passion for science, math, computers, and fluid dynamics is just part of my genetic makeup.
What advice do you have for young people entering the field today?
Here’s some advice I gave that helped a new graduate student at one of the local universities get a job in our field.
Be motivated to learn about and be good at many things. We all have our one or two passions, but it’s important to have a broad skill set. Things like learning to program. Don’t learn just one language; learn as many as you can. Low-level languages like C and C++ and high-level languages like Python and Tcl/Tk are good to know in our field. I should note that while Matlab is OK for simple analysis and data post-processing, you do not learn about the science of computing with it (hardware operation, memory management, I/O, numerical methods – see below). That, and it’s horrifically slow. Another one is the Linux operating system. It is very common in the CFD world, it’s free, and it’s easy to install.
Be interested in the fundamentals of numerical methods. Things like quadrature, interpolation, root finding, Lagrangian vs. Eulerian descriptions, and linear and non-linear systems form the foundation of the finite element method, the finite volume method, and others. By the way, a thorough understanding of linear algebra (non-linear systems are converted to linear ones, so it all ends up being linear) is a very desirable skill. Learn about FEM, FVM, and the finite difference method, or even Lattice Boltzmann. These methods form the foundation of numerical analysis software such as ANSYS (FEM), Fluent (FVM), or PowerFLOW (LBM). You don’t see FDM much anymore, but it’s good to be aware of it. If you really want to roll up your sleeves and get dirty, take a look at Caelus or OpenFOAM.
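To show where the linear algebra comes in, here is a tiny worked example (the cell count, spacing, and boundary temperatures are arbitrary choices of mine): a 1-D steady diffusion problem discretized with the finite volume method reduces to a tridiagonal linear system, and solving that system is most of the work.

```python
# 1-D steady diffusion, finite volume discretization on a unit-spaced grid
# with unit diffusivity: assembling A*T = b and solving it with NumPy.
import numpy as np

n = 5                          # number of cells
T_left, T_right = 100.0, 0.0   # Dirichlet temperatures on the boundary faces

A = np.zeros((n, n))
b = np.zeros(n)
for i in range(n):
    a_P = 0.0
    if i > 0:                  # west neighbour
        A[i, i - 1] = -1.0
        a_P += 1.0
    else:                      # west boundary face, half a cell width away
        a_P += 2.0
        b[i] += 2.0 * T_left
    if i < n - 1:              # east neighbour
        A[i, i + 1] = -1.0
        a_P += 1.0
    else:                      # east boundary face
        a_P += 2.0
        b[i] += 2.0 * T_right
    A[i, i] = a_P

T = np.linalg.solve(A, b)
print(T)                       # [90. 70. 50. 30. 10.] -- the exact linear profile
```

Every production FVM code is doing a far larger, sparser version of exactly this assemble-and-solve loop, which is why linear algebra keeps coming up.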
While it’s good to have experience with commercial CAE software, it’s important to have at least a cursory understanding of what’s inside it. You don’t want to be just a “crank turner.” Even if your passion is designing fluid systems and the software is simply one tool of many, understanding how it works will make you much more productive and enable you to make better decisions about the results. I use the “garbage in, garbage out” analogy with CFD a lot. Just because it can produce pretty pictures doesn’t mean it’s correct. In fact, more often than not the results will be wrong. One has to work hard to get a “right” one.
I’ve seen what employers look for in candidates, and they want someone who’s worth the money. Engineers with graduate degrees who work with CFD typically expect higher salaries. That means employers commensurately demand more out of them and will be more cautious about whom they hire. Because hiring people in these fields is a bigger investment, the risk to the company goes up. Clearly, someone who can handle many diverse tasks has more value, and therefore the potential return on the company’s investment goes up.
John: You realize I was able to graduate from Syracuse with an undergraduate degree in engineering without taking any linear algebra, right?
Chris: I’m surprised it wasn’t a required course.
John: Seriously, when it comes to hiring new college grads, my approach has been that skills can be taught; therefore, hire for aptitude. In other words, learn the fundamentals, learn how to learn, and you’ll be a strong candidate. Plus be a solid communicator.
Chris: I totally agree with that statement. The role of university is to teach you how to learn. The “skills to learn” I mentioned are topics you should pick up on your own. That’s the beauty of grad school – it offers the freedom and flexibility to develop skills while you learn. I am not sure it’s necessarily the role of the employer to teach those skills, but having any of them sure looks good to an employer. Let’s not even open a discussion about the generally abhorrent writing and communication skills most engineers have.
How do you know Pointwise?
The company/people: I used to work there. The software: I used Gridgen in graduate school.
John: I think it’s important to make readers aware that you left because you had an entrepreneurial itch you wanted to scratch and a desire to do more programming. It would’ve been great if you were still here.
Chris: That’s right. I had a lot of fun working at Pointwise and have a lot of respect for the people and company.
Can you share with us your favorite tools and resources that help you get your job done?
Linux and in particular a Bash shell: Honestly, I don’t know how people develop software without it. By the way, you’re a lunatic if you’re using a C shell. Korn shell and zsh are OK though.
Stack Overflow: Honestly, I don’t know how people develop software without it or even before the internet.
Pointwise: Honestly, I don’t know how people make meshes without it.
Vim: Honestly … I need not say more.
Everything else I have an open mind about.
If we were to visit you, where’s a good place to go out for dinner?
I haven’t fully explored Ottawa’s eatery scene, but there’s an excellent vegetarian restaurant called the Green Door on Main St. near St. Joseph’s College. I’ve taken the most hardened carnivores there and they’ve all given it the thumbs up, myself included. Ottawa has an almost ridiculous number of shawarma restaurants, so if Mediterranean food blows your hair back you’d be in heaven here.
John: I was hoping for a poutine recommendation.
Chris: Right, I forgot about your affinity for it. So far the best poutine I’ve had has been at what they call “chip shacks” here – your typical little mom-and-pop burger joint. My favorite so far is Casse-Croute Limbour on Rue St. Louise in Gatineau, QC, along the Gatineau River near the Chelsea dams.
John: OK, I guess veggies it is the next time I come to visit. Thanks for taking the time for all these questions.