C++Now 2017: Daniel Pfeifer “Effective CMake”

Captions
My name is Daniel, and I will talk about CMake. My motivation for this talk is that I have a large audience of library developers here, and the important takeaway for you library developers is that the way you use CMake, and that includes not using CMake at all, affects your users. Very often, for example, I want to include a library in my project and I see: okay, this library is even built with CMake, so it should be easy to include into my project. Sadly, that's often not the case. So I will give you some guidelines for how you can make that better, so it's easier to use your code. When you follow the status of build systems, you see from time to time someone saying "use this new build system", and you may ask: why CMake at all? Interestingly, there are some similarities with C++. Both CMake and C++ have a large user base; I would say they dominate the industry. They both have a very strong focus on backwards compatibility, so we try not to break anyone's code by adding or removing features. They are both very complex and feature-rich: C++ is called a multi-paradigm language, and I would also say that CMake is a multi-paradigm toolset, because it's not only cmake itself that you use to configure a project; it comes with a collection of other tools for different use cases. Both have a somewhat bad reputation; I have heard the terms "bloated" and "horrible syntax" referring to both CMake and C++. And both have some not very well known features; I hope to show you some today. My take is that when someone tries to introduce a new build system and makes it "so much better", this is what it will lead to. Even if CMake is not perfect, it would be better to improve CMake and follow the pragmatic CMake way instead of trying to replace it. From time to time in my talk I will have slides with a dark background; this is always when I want to give you guidelines.
These guidelines are the stuff that I will be asking in the exam afterwards. This one is quite generic, just to serve as an example: CMake is code. What that means is open to everybody's own interpretation, but whatever is important to you in your C++ code base, like "don't repeat yourself", you should apply the same principles in the configuration of your project. Okay, let's talk a little bit about the ugly syntax of CMake. CMake is organized like this: we have directories. Directories are the entry point for generating a build system; they contain a file called CMakeLists.txt, and you can add subdirectories with the add_subdirectory command. Those subdirectories then also need to have a CMakeLists.txt file inside. Then we have scripts. Scripts are executed with cmake -P. It would even be possible to put a hashbang into a CMake script and set the executable flag, so you can execute CMake scripts directly on the command line. In CMake scripts not all CMake commands are supported; things like project and add_executable are only allowed in projects. Then we have modules. Modules are CMake files located in the CMAKE_MODULE_PATH variable; when you use the include command, either in a CMake project or in a CMake script, this will include the module. Of course, including a module from a script is only possible if that module uses only the commands that are allowed in a script. CMake files look like this: you have commands and then a space-separated list of strings; each identifier is a string. Scripting commands change the state of the command processor; for example, you can set variables, but there are also ways to affect the behavior of other commands. Project commands create build targets and change build targets.
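A minimal sketch of that layout (the project and file names here are invented for illustration, not taken from the talk):

```cmake
# CMakeLists.txt — the entry point of a directory
cmake_minimum_required(VERSION 3.0)
project(example CXX)

add_executable(app main.cpp)
add_subdirectory(src)      # src/ must contain its own CMakeLists.txt

# A standalone script (script.cmake) may only use scripting commands
# (no project(), no add_executable()) and is run with:
#   cmake -P script.cmake

# Modules are found via CMAKE_MODULE_PATH and pulled in with include():
#   list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_LIST_DIR}/cmake")
#   include(MyModule)      # loads cmake/MyModule.cmake
```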
Importantly, command invocations are not expressions in the CMake language, so you cannot put a command invocation directly as an argument of another command or inside an if condition; that's not possible in CMake. We have variables; they are set using the set command and then expanded using the dollar-and-curly-brace syntax. As I said, all variables are strings, even lists; lists just follow the convention that the elements are separated by semicolons. CMake variables are not environment variables; that's a mistake some people make. They think "why is it not defined? I've set it here", but CMake variables are separate from the environment. And if a variable is expanded using ${} and it was not previously set, it expands to an empty string. That's a source of problems, and therefore it's often advocated to avoid variables. Comments in CMake come in two forms. Single-line comments are probably widely known. Multi-line comments are not widely known; at least they are not known by the syntax highlighting engine of these slides. Multi-line comments start with a hash, then a bracket, then any number of equal signs, then another bracket; they are closed by a closing bracket, the same number of equal signs, and another closing bracket, so they can be nested. It's also very interesting that if we add an additional hash on the opening line, that line turns into a single-line comment, all the code inside becomes enabled, and the closing line is also just a single-line comment. Then we have generator expressions in the CMake language. These are introduced with a dollar-and-angle-bracket syntax, and they are evaluated during build system generation; in scripts they are just strings. Since they are evaluated during generation, they are not supported in all places, only in places where you really modify build targets. It's obvious that they are not supported inside an if, because the if is evaluated during the configure run.
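The pieces just described, sketched in one fragment (the variable and target names are invented):

```cmake
set(names Alice Bob)                  # stored as the string "Alice;Bob"
message(STATUS "names = ${names}")    # unset variables expand to ""

# single-line comment
#[=[
  bracket comment: opened by  #[=[  and closed by  ]=]  with the same
  number of '=' signs; varying that number lets such comments nest
]=]

# Generator expressions are evaluated at generation time and are only
# valid where targets are modified, e.g. a Debug-only definition:
#   target_compile_definitions(foo PRIVATE $<$<CONFIG:Debug>:FOO_DEBUG>)
```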
Okay, it is possible in CMake to define custom commands, and there are two ways: commands can be added either with the function command or with the macro command. The difference between a function and a macro is like the difference in C++. And this is probably something that not many of you knew: when you create a command and a previous command with that name already exists, the old command remains accessible with an underscore prefix. We will see what interesting use case that allows. So this is how we define a custom function in CMake: we use the function command with the name and the names of the input arguments. A function introduces a scope: variables that you set inside the function are only valid inside the function, unless you set them using the PARENT_SCOPE keyword; then the variable is set in the scope from which the function is called. Inside the function we have available the variables that we named in the argument list, but also ARGC, the total number of arguments; ARGV, the actual list of arguments; ARGN, the list of arguments that we have not assigned names to; and ARGV0, ARGV1, ARGV2, up to ARGV9. This allows us to support optional arguments. In the example here, ${output} expands to "bar". However, if you define a macro, it does not introduce a new scope, and the argument names are not treated as variables; they are just text replacements. Most of the time this does not make a big difference; the difference shows when you check whether a variable with the name "input" exists: in the macro case it doesn't, because it's not a variable. Can you guess, from what you have heard now, when to add a macro and when to use a function? You should usually use a function; a macro, though, doesn't introduce a new scope.
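A sketch of the scoping rules just described (the names are hypothetical, not the slide's):

```cmake
function(make_greeting output)
  # ARGC = total argument count, ARGV = all arguments,
  # ARGN = the trailing arguments without names
  set(${output} "Hello, ${ARGN}" PARENT_SCOPE)  # without PARENT_SCOPE
endfunction()                                   # the caller sees nothing

make_greeting(greeting World)
message(STATUS "${greeting}")   # prints: Hello, World

macro(leaky)
  set(leaked TRUE)   # macros have no scope of their own: this variable
endmacro()           # is set directly in the caller's scope
```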
Since a macro doesn't introduce a new scope, when you define a variable in it, the variable is also available in the calling scope. Therefore the guideline I will give you is: create macros when you want to wrap a command that has output parameters, because you don't know what that command will set in its calling scope, so you don't know what to forward; wrap it in a macro and the wrapper will have the same side effects as the wrapped command. Otherwise, create a function and avoid side effects. Okay, so now you know how to add custom functions and commands to CMake, but at some point you may realize that adding one was actually a bad idea. So how do we evolve CMake code? You want to remove the custom command, but it's probably used all over the codebase already, so how do you remove it without breaking the code? You want some form of deprecation mechanism. This is what I would do: create a macro, in this case because we want the same side effects as the wrapped function, that prints a deprecation message and then calls the original command, simply forwarding all the arguments. For variables it's a little more complicated. We want to make sure that the "hello" variable is no longer used, so we want to warn when someone accesses that variable. There is an interesting built-in command in CMake called variable_watch: you can register a function that will be executed whenever the variable is accessed. So here we define a function called deprecated_var; it receives which variable was accessed and whether the access was a read or a write; we check whether it was a read access, and in that case we simply print a deprecation message saying that the variable is deprecated. Deprecating your custom commands and variables is probably what you should do, because since CMake 2.8.12, modern CMake is all about targets and properties. So we talk about targets and properties next.
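Sketched from the speaker's description (the command and variable names are placeholders; the wrapper assumes a command my_command was defined earlier):

```cmake
# Deprecate a custom command: redefine it as a macro so the wrapper has
# the same side effects; the old definition stays reachable with '_'.
macro(my_command)
  message(DEPRECATION "my_command is deprecated")
  _my_command(${ARGN})   # forward all arguments to the old definition
endmacro()

# Deprecate a variable: warn whenever it is read.
function(deprecated_var variable access)
  if(access STREQUAL "READ_ACCESS")
    message(DEPRECATION "variable ${variable} is deprecated")
  endif()
endfunction()
variable_watch(hello deprecated_var)
```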
Okay, you see here in this code there are no variables at all, right? What you typically see in CMake code in the wild is this: you define a list called SOURCES with all the source files, then depending on the platform you add some values to that list; you also have a list of link libraries; and in the end you call add_library and target_link_libraries with all those variables. But you see, that's absolutely not necessary. We can add the library, then add the link libraries, and later, depending on the platform, we can simply add additional sources or additional link libraries to the target. This is much more robust than using variables, because if you make a typo in a variable name, it just gives you an empty string, and you have no way to debug that. Therefore: avoid custom variables in the arguments of project commands. This also has an implication: if you are not allowed to use variables, that of course also means you don't use file(GLOB) in projects. How many of you use file(GLOB) in CMake? Do you think it's a good idea? What is the problem? The comment from the audience was that it's useful because you can simply add a file; for example, you create a file from your IDE, and depending on which IDE you use it may rerun CMake for you (QT Creator, for example, will simply always rerun CMake), but not all IDEs do that, and then you will not get the effect of adding that file. The fundamental problem is that CMake is not a build system; CMake is a build system generator. I would say file globbing in a build system is nice, because when you trigger the build, the build system evaluates the globbing expression and gives you the current list of files. But with CMake it's different: when you generate the build system, CMake evaluates the globbing expression and gives you a list of files, but the generated build system then contains only that fixed list of files.
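The variable-free style the speaker prefers might look like this (the target, file, and library names are invented):

```cmake
add_library(foo foo.cpp)
target_link_libraries(foo PUBLIC bar)

# Instead of appending to a SOURCES variable, extend the target itself;
# a typo in a target name is an error, a typo in a variable is silent.
if(WIN32)
  target_sources(foo PRIVATE foo_win32.cpp)
  target_link_libraries(foo PRIVATE ws2_32)
endif()
```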
So with a globbed list, when you trigger the build system, it has no idea that something actually changed. Well, you may think it might be nice if CMake, instead of evaluating the globbing expression, would simply forward the globbing expression to the build system, so the expression is evaluated when you trigger the build. But that's not supported by all build systems: it would be possible with make, but it would not be possible in Visual Studio, for example; and since CMake wants to be the common denominator, it's just not possible. File globbing is fine in CMake scripting mode, because when you execute a script, that execution is the actual trigger, so the globbing expression is always evaluated then; you just should not use it in CMake projects. Okay, so CMake has this concept of targets, and it inherits that terminology from makefiles, but you may imagine it as an object-oriented concept. You have constructors like add_executable or add_library that basically create an object, and those targets have properties, too many to list here; just imagine those are member variables. Then you have member functions: the generic ones to set properties by name, but also commands like target_compile_definitions, target_compile_features, target_compile_options, target_include_directories, target_link_libraries, and target_sources. Calling such a function modifies the member variables of that object; imagine it like this. By the way, forget the directory-level counterparts of those commands; if you ever use them, refactor your code so you no longer use them, and if you have never heard of them or used them, good. Those commands operate on the directory level: include_directories, for example, sets a directory property on the current directory, and all the targets, like libraries, that are created inside that directory will inherit those properties.
That just makes things complicated to understand; it's much better to operate on the targets directly. Here's an example of one of those member functions: target_compile_features. All those commands use the keywords PUBLIC and PRIVATE, and also INTERFACE. CMake commands just take a list of strings, so when you use your editor to indent this, it would probably put everything on a single line. I usually use an indentation like this: I put the keywords indented one level and everything below them indented another level. What we actually do here is set the COMPILE_FEATURES and the INTERFACE_COMPILE_FEATURES properties: everything below PUBLIC is added both to COMPILE_FEATURES and to INTERFACE_COMPILE_FEATURES, and everything below PRIVATE is added just to COMPILE_FEATURES. What this actually does is tell CMake about the language features that you need inside that library. For example: in my public header files, let's say I use some strongly typed enum, and inside my implementation I use lambdas and range-based for loops. What I have seen previously is that many people think "okay, my code requires C++11, so I pass the appropriate compiler option, -std=c++11, on the compiler command line". But this is guaranteed to break in the future, because those requirements are also fulfilled in C++14 and 17, right? Also, the compile flag is not the same on all compilers. So it's much better to tell CMake "those are my requirements; you figure out what compiler flags to use". (Question from the audience.) Okay, I can come to that later; basically this is the replacement for the compile-options part, and the include part we will talk about later.
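A hedged reconstruction of the compile-features example (the target name foo is illustrative; the feature names follow the speaker's description of strong enums in the headers and lambdas plus range-based for in the implementation):

```cmake
target_compile_features(foo
  PUBLIC               # needed by users of the public headers
    cxx_strong_enums
  PRIVATE              # needed only inside the implementation
    cxx_lambdas
    cxx_range_for
)
# CMake derives a suitable flag (or none) per compiler, instead of a
# hard-coded -std=c++11 that would wrongly exclude C++14/17.
```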
CMake has build specifications and usage requirements; this is something that was actually inspired by Boost.Build's usage requirements. You have seen previously that we have these INTERFACE and non-INTERFACE target properties: the non-INTERFACE properties define the build specification of the target, and the INTERFACE properties define the usage requirements of the target. So when we call a command like target_include_directories and we use the PRIVATE, INTERFACE, and PUBLIC keywords: the PRIVATE keyword populates the non-INTERFACE property, because it's just for that target; the INTERFACE keyword populates the INTERFACE property, that is, the usage requirements; and the PUBLIC keyword populates both, meaning it's valid both for the target itself and for all the targets that link against it. When you use the command target_link_libraries, you express the direct dependencies, and this command then also resolves all the transitive dependencies. So, to answer your question: in Boost you would add, let's say, a library called boost_filesystem, and you set the public include directories for boost_filesystem, so anyone who links against boost_filesystem gets the correct include directory. That's why I said it's all about targets and properties. (Several audience questions.) I will get to this, thank you. The question was about Boost: there is a find package for Boost, it is very difficult to use, and I will come to that later. So, an example of target_link_libraries: I write target_link_libraries(foo PUBLIC bar PRIVATE cow). That means the target bar is added to LINK_LIBRARIES and to INTERFACE_LINK_LIBRARIES, because it's PUBLIC; the target cow is added only to LINK_LIBRARIES, as just the build specification of that target. But that's not all, because this effectively also adds all the INTERFACE properties of bar to the corresponding properties and INTERFACE properties of foo, and it effectively adds the INTERFACE properties of cow as well.
The INTERFACE properties of cow are added to the plain properties only, in this case not to the INTERFACE properties, because it's PRIVATE. I say "effectively" because it's not really what this command does; it's what is done later, when the dependencies are resolved transitively, because those targets may not be defined yet, right? They are resolved transitively after all targets are defined. Very importantly, this also adds the $<LINK_ONLY:cow> generator expression to the INTERFACE_LINK_LIBRARIES. Imagine a library is a static library and it depends on another library, and you want to link against that static library: on the linker command line you will see that both the library you directly depend on and the dependency of that library appear. But in CMake you just express the abstract interfaces, and therefore CMake needs to know that this dependency is link-only, while the include directories, for example, are not transitively added to this target foo here. (Question.) Okay: this is a target that is defined in some other place, and the reason we have this double colon here is that, for legacy reasons, when you use target_link_libraries with a name and that name is the name of a target, CMake will resolve the dependency; if it is not a target, CMake assumes it is probably a library and adds -l plus the name you provided to the linker command line. So when you make a typo there, the linker will fail, right? This double-colon syntax cannot be a valid file name, which means it has to be a target, which means that when you make a typo here and that target does not exist, you get an error during CMake generation, not after compilation during linking. (Question.) No, it does not have to be the same name; it would be nice to have Boost::filesystem, for example. (Question.) Yes, you can imagine it as a namespace; I will come to that later.
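The example under discussion, as a sketch:

```cmake
add_library(foo foo.cpp)
target_link_libraries(foo
  PUBLIC  bar   # goes to LINK_LIBRARIES and INTERFACE_LINK_LIBRARIES
  PRIVATE cow   # goes to LINK_LIBRARIES; users see only $<LINK_ONLY:cow>
)

# A namespaced name such as Foo::Foo cannot be a file name, so a typo
# in it fails at generate time instead of at link time:
#   target_link_libraries(foo PRIVATE Foo::Foo)
```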
The question was where we define the namespace; I come to that later. It is also possible to have libraries that are pure usage requirements: when you create a library of the INTERFACE type, it is actually not a library, so it is impossible to set a build specification for it, like PUBLIC or PRIVATE include directories; it is just pure requirements. In this example we create a library and we define the target compile definition BAR=1, so every executable or library that links against bar will have that symbol defined. This is very useful for header-only libraries: you create a header-only library as just a pure interface, and you set the target include directories for that library, so that everybody who links against it (it's not really linking, rather expressing a dependency on that library) gets the right include directory. However, please don't abuse usage requirements: for example, adding -Wall to the compiler flags is not a requirement to build against that project. Now we talk about project boundaries, that is, how you should link against external libraries. I would say: always like this. You call find_package with the name you want to find; you may require a particular version number, and you can say whether it is optional or whether you actually require it. In this case we say Foo is REQUIRED, and this find_package imports a target named Foo in the Foo namespace, and then we use it. Always like this. Now the question is: if foo is a static library and it depends on other libraries, how should it look in that case? It should look exactly the same way, right? If foo is a header-only library, how should it look in that case? The same way, right? You are beginning to understand what I mean when I say "always". But now the question is: where does this Foo actually come from when you call find_package?
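Both patterns from this part of the talk, sketched with hypothetical names:

```cmake
# A header-only library is a pure set of usage requirements:
add_library(bar INTERFACE)
target_compile_definitions(bar INTERFACE BAR=1)
target_include_directories(bar INTERFACE "${CMAKE_CURRENT_SOURCE_DIR}/include")

# Consuming an external library: always the same two lines, whether
# Foo turns out to be shared, static, or header-only:
find_package(Foo 2.0 REQUIRED)
target_link_libraries(app PRIVATE Foo::Foo)
```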
find_package will then search for a find module. This, for example, is a find module for foo. It looks like this: it searches for the include directory; it searches for the library; it marks those variables as advanced, so they do not appear in the cache editor; then there is some standard handling (this code here handles the REQUIRED keyword and the version number); and then we have this check: if foo was found, which is set by find_package_handle_standard_args, and the target Foo::Foo is not yet defined, then we create an imported library with that name, we set the necessary target properties, and we're done. Okay, this is a very, very basic example. (Question.) If I added the version handling, it would not fit readably on the slide here; I will show another example. The question was what UNKNOWN means in this case: it means we do not know whether this is a static or a shared library. So this is a very basic example: it does not handle the version number, and it does not handle different configurations; you can imagine there may be debug and release configurations. (Question.) No, no, sorry, I will come to that. This is what many people do: they think that as library authors they need to provide something like this. No, this is just the basic example of how find modules look. Like I said, it does not handle many cases: it does not handle the version number; it does not handle different configurations, where for example we want to make sure that in the debug build we link against the debug library and in the release build we link against the release library; and we just set some properties here that may not really be the actual usage requirements defined by the library author, right?
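The basic find module just described might be sketched like this (names are illustrative; as the speaker notes, this intentionally omits version and configuration handling):

```cmake
# FindFoo.cmake — a minimal find module
find_path(Foo_INCLUDE_DIR foo.h)
find_library(Foo_LIBRARY foo)
mark_as_advanced(Foo_INCLUDE_DIR Foo_LIBRARY)

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(Foo
  REQUIRED_VARS Foo_LIBRARY Foo_INCLUDE_DIR)

if(Foo_FOUND AND NOT TARGET Foo::Foo)
  add_library(Foo::Foo UNKNOWN IMPORTED)  # UNKNOWN: static or shared
  set_target_properties(Foo::Foo PROPERTIES
    IMPORTED_LOCATION "${Foo_LIBRARY}"
    INTERFACE_INCLUDE_DIRECTORIES "${Foo_INCLUDE_DIR}")
endif()
```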
In reality those find modules look much more like this. I'm not sure about the resolution; can you read this in the back? Okay. So this is much, much more; this is FindPNG, and the find module for PNG serves as a good example. But you get the point that it's way too complicated, and the most important part is that it's a lot of guessing, because the library author knows what the dependencies are, and when he creates a package that information is completely thrown away, and then we have all this guessing about what the actual interface requirements were. So now I come to the point of what you should do as library authors, because find_package does not only look for find modules; it also looks for CMake packages, and you can use CMake to generate those packages. So the guideline is: use a find module only for third-party libraries that are not built with CMake. For example, PNG is not built with CMake, and therefore CMake provides a find module for PNG. And this is exactly the point: many authors ask "when I write a find module for my library, where should I put it?" (Question.) Do you mean as a client of that library? The question was about the provider of a header-only library: as a provider you should still use a build system, because you don't have just that library; you also have tests and documentation and all that stuff. You don't deliver those, but you should still use a build system in any case, even for header-only libraries, because you want to run the tests. And you still use a target in your build system for that header-only library, because it will be consumed by the tests; and since you already have this interface target, you can export it to the clients using CMake. You see here this text is rather at the top; you can imagine there will be something else coming. And this is not actually accurate: it's not about whether you use CMake or not.
It's about whether you want to support clients that use CMake. For example, Boost does not support clients that use CMake. There is a find module for Boost that comes with CMake, so CMake supports clients using Boost from CMake, but that support really comes from CMake; it does not come from Boost. Qt, on the other hand, does not use CMake at all; it uses its own build system, but it supports clients that use CMake: it provides CMake packages that can be included using find_package. So you can write find_package(Qt5) in your project, but there is no find module for Qt5; it is a CMake package that is shipped with Qt. Therefore: if you need to write a find module for a third-party library, report this as a bug to its authors, because CMake is so common in the industry, most people use it, and if a library author does not support it, it is a problem. Okay, so I will show how to export the library interface using CMake; then, when you want to support another build system, you can look at what CMake generates and generate the same thing, for example add support for that in Boost.Build. We have this library foo; it may be a header-only library, a static library, or a shared library, whatever. We have some usage requirements; in this example we say we link against bar. Bar may also be a header-only library; we don't care; we just say this is a requirement of foo. Now we install the library, and we define the locations where the files should appear: the archive and library destinations mean that static archives and libraries appear in lib; the runtime component ends up in bin (imagine on Windows this will be the DLL); and we say the include destination is include. And this is the interesting part: we store all of that in an export set called foo-targets, and we do not only install the library.
We also install the export set. We say: install the export set called foo-targets into this file here, which will be generated by CMake; the library should appear in the namespace called Foo::; and here is the destination for the file. It is a common pattern to use lib/cmake/ followed by the name of the component. That is one thing. The other thing is the version: CMake provides this helper module here, with which you can generate a file that checks the version number. And we also write one more file by hand, because, as we saw before, the library foo depends on bar. When we create this export set, CMake knows that foo depends on a target called bar, but it does not know where that target comes from, so we have to give the client this information as well. So this is a file that we write by hand: we include this module here, we call the function find_dependency with bar and the correct version number, and then we include the file that was generated by CMake. Both the version file and the file that we wrote by hand we install to the same location. This is some boilerplate, and I wish it were easier, but at least it is straightforward: whereas the find module was just guessing what is located where, this is basically what all libraries need. It is always the same thing, no matter how many dependencies I have, no matter whether they set include directories or compile flags or compile definitions or whatever; basically this is what you need in every case. (Question: is the destination relative to the installation directory?) It is relative to the installation root, which is not the same as the CMAKE_INSTALL_PREFIX: when you simply run make install, the installation root is the CMAKE_INSTALL_PREFIX.
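The export boilerplate described above, sketched (the names foo and bar follow the talk; the file names are the conventional lowercase ones, an assumption on my part):

```cmake
install(TARGETS foo EXPORT foo-targets
  ARCHIVE DESTINATION lib
  LIBRARY DESTINATION lib
  RUNTIME DESTINATION bin
  INCLUDES DESTINATION include)

install(EXPORT foo-targets
  FILE foo-targets.cmake
  NAMESPACE Foo::
  DESTINATION lib/cmake/foo)

include(CMakePackageConfigHelpers)
write_basic_package_version_file(
  "${CMAKE_CURRENT_BINARY_DIR}/foo-config-version.cmake"
  VERSION 1.0.0 COMPATIBILITY SameMajorVersion)

# foo-config.cmake, written by hand, forwards the dependency on bar:
#   include(CMakeFindDependencyMacro)
#   find_dependency(bar 1.0)
#   include("${CMAKE_CURRENT_LIST_DIR}/foo-targets.cmake")

install(FILES "foo-config.cmake"
  "${CMAKE_CURRENT_BINARY_DIR}/foo-config-version.cmake"
  DESTINATION lib/cmake/foo)
```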
But when you use CPack to create a package, it installs to a temporary directory, takes the content from that directory, and puts it into an archive; so it really is a relative path. (Question.) This will appear in your main CMakeLists.txt; I mean, it does not have to be the main one, it can be in a subdirectory, and you can put everything in one file, that's correct. The question is why we include the foo-targets file. Okay, let's say this file here is called foo-config.cmake and it contains just these three lines. Sorry: this here is in your CMakeLists.txt; this is the place where you create the executables and the libraries, and this directly follows. This here is still the same file, and this is a separate file, because this is the file that you install. You basically install several files: this one; the one that is generated here, which is called foo-config-version; and the foo-targets one, which itself depends on a set of files, right? The targets file is a file that includes all the target definitions per configuration: one for debug, one for release, and even more if you have more configurations. Very important: import and export the right definitions. For example, your target include directory when you build may be different from when you install; when you install, it is probably just include. So we use generator expressions: in the install interface we use include, and in the build interface we use the include directory in the current source directory and also one in the binary directory. Because imagine you have generated files, right? In your build tree you have a directory where you put all your include files, but you will have an include directory also in the build directory, because you want to generate a config.h or your version information or whatever. But when you install, you install everything to the same location. So in the build interface you have two directories, and in the install interface you have just one.
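The generator-expression pattern for include directories, sketched:

```cmake
target_include_directories(foo PUBLIC
  $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
  $<BUILD_INTERFACE:${CMAKE_CURRENT_BINARY_DIR}/include>  # generated headers
  $<INSTALL_INTERFACE:include>   # relative to the install prefix
)
```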
OK. So now we know how to tell CMake all the information that should appear in the package; now let's actually create packages. Creating packages is done with CPack. CPack is a tool that comes together with CMake, and using it is really straightforward: there is a CPackConfig.cmake, and it basically contains just a list of set() commands where you set some variables, so I will not go into much detail on that. I just want to give you a hint about how I usually use it. You can also use the set() command to set the CPACK_ variables in your CMakeLists.txt file and then include the CPack module; this module will take all those variables and create a CPackConfig for you. The way I usually do it: I write my own CPackConfig, and in that file I include the one that is generated by CMake. This allows me to set additional variables in the CPack configuration that I do not want to appear in my project — for example, not the license file, but say the email address that customers should use to contact the authors. That does not have to appear in the project definition; I put it in a separate file.

OK, another interesting use case is this. In CPack there is a variable called CPACK_INSTALL_CMAKE_PROJECTS. The documentation says it is a list of four values, but it is not a list of four values — it is a list of quadruples, because you can install several CMake projects. The first value is always the build directory; then the project name (I don't know why this is required, I think it's redundant); then the project components; and the location where it should appear in the package. Here is a way of using this. First we make a change in the CMakeLists file: we set CMAKE_DEBUG_POSTFIX to "-d". So imagine you have a library called foo: the actual file name will be libfoo.a, for example, in release builds.
In a debug build it will be called libfoo-d.a, so they do not overlap: I can install both to the same directory and they will have different file names. Then I use the following CPack config. Second step: I create separate build trees for debug and release. Third step: I create the CPack config. Here I just include one of the two, either debug or release, just to get all the information that CMake sets by default, and then I override this variable. So I say: install the Foo project from the debug directory, take all the components, and put them in the root of the package; and also, from the release directory, take the Foo project, all components of it, and put them in the root of the same package as well. Question — is the build tree hard-coded? No, it's not; this is just a file that I generate. This is a CPack config file, and I generate it when I want to make a package. I usually have a script where I can tell it: here is the source directory. The script will then create two build trees, configure both, compile both, then one level above create this file and run CPack on it, and it will take the debug and the release builds and put them in the same package. That file is not committed.
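The generated multi-configuration CPack config described above might look roughly like this — the directory names `debug` and `release`, the project name `Foo`, and the contact address are assumptions:

```cmake
# Hand-written CPack config: wraps the generated one and installs the
# debug and release build trees into the same package.
include("release/CPackConfig.cmake")

# Extra metadata that should not appear in the project definition itself.
set(CPACK_PACKAGE_CONTACT "support@example.com")

# Quadruples: <build dir>; <project name>; <components>; <dir in package>
set(CPACK_INSTALL_CMAKE_PROJECTS
  "debug;Foo;ALL;/"
  "release;Foo;ALL;/"
  )
```

Running `cpack --config` on this file one level above the two build trees then produces a single archive containing both variants, distinguished by the debug postfix.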
OK. So since we can now import packages and also export packages, we basically have all the building blocks to create a package management system. I will not present one, but I will show my requirements for a package manager, because there are several approaches to building a package management system on top of CMake, and I have not found one that really fulfills all my requirements. My requirements: I want it to support system packages — when libc is installed on the system in a version that I can use, I see no reason why the package manager should download an additional one. It should support both pre-built libraries and building dependencies as subprojects, because I want to mix and match: take this dependency pre-built and that other dependency as source. And yet another important requirement: it should not require any changes to my projects. What I have shown previously — using add_library, find_package, et cetera — that is the way CMake should be used. When you have a package manager that requires you to use, I don't know, pm_add_target or pm_add_library instead of add_library, or download_dependency instead of find_package — no, I am against that. It really should be just the way I presented, and I can guarantee it is possible.

I said before that external libraries should always be used like this, and now we go through all three cases: it should support system packages, subprojects, and downloading pre-built libraries. Let's have a look. If you just write this, then system packages work out of the box, because CMake will simply be able to find them. Pre-built libraries will not work out of the box, because CMake does not know where the pre-built one was put. So imagine your package manager puts all the dependencies it downloads into a directory of its own — just tell CMake about that directory: you set the variable CMAKE_PREFIX_PATH, and that is where CMake will look for the dependencies, in addition to the system. With subprojects we have two problems. Imagine this: you have a master build with two directories — this is directory foo, and this is your actual application. The actual application will call find_package(Foo), but we need some way to prohibit that, because foo is part of the same super-project. Therefore we need to turn find_package into a no-op, and we also need to make sure we have this naming convention with the double colon. Let's talk about the second part first. Here is an additional guideline for your CMake projects.
Whenever you export a library foo in a namespace Foo, at the same time also create an alias Foo::foo. That means that using foo inside the same build tree looks exactly the same as using it as an external library. I put this here as a guideline. Then, for the other part, I will show you this trick. This is the top-level CMake project: we set the prefix path, and we set a list of the subprojects that we know are built as subprojects. Then we override the find_package command: we simply check whether the first argument appears in that list, and only if it does not do we call the actual find_package. Simple as that. And then we add all the subdirectories. Question — why do I want to add a subdirectory when it is in the system? No, it is not in the system; it is a subproject — you see, here foo is in the subprojects list. Question again — this allows a directory hierarchy that is one level deep; what do I do when I have a super-project of super-projects? Imagine a package manager: you give the package manager the list of dependencies that you have, the package manager goes to a web service, downloads a JSON file, calculates the transitive list of dependencies, and in the end it has the information in three lists. It knows which libraries are available on the system, it knows which libraries can be downloaded as pre-built binary packages, and it knows which libraries need to be built as subprojects. For the system libraries it does not do anything, because they work out of the box. For the libraries that can be downloaded pre-built, it downloads them and puts them all in the same directory — the prefix, for example.
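A sketch of the top-level super-project trick described in this section — the project and directory names are placeholders:

```cmake
# Generated top-level CMakeLists.txt of the super-project.
cmake_minimum_required(VERSION 3.8)
project(superproject)

# Where the package manager placed the pre-built dependencies.
set(CMAKE_PREFIX_PATH "${CMAKE_CURRENT_SOURCE_DIR}/prebuilt")

# Dependencies that are built as subprojects of this super-project.
set(as_subproject Foo)

# Turn find_package into a no-op for subprojects; forward everything else.
macro(find_package)
  if(NOT "${ARGV0}" IN_LIST as_subproject)
    _find_package(${ARGV})
  endif()
endmacro()

add_subdirectory(foo)  # defines foo and the alias Foo::foo
add_subdirectory(app)  # calls find_package(Foo), links Foo::foo
```

The underscore-prefixed `_find_package` is how CMake exposes the original command once it has been overridden; this behavior is undocumented, so treat the macro as a pragmatic trick rather than a guaranteed API.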
For the others — the ones that need to be built as subprojects — it will download them, or clone them, each into its own directory, and then it will generate this file: it sets CMAKE_PREFIX_PATH, it sets the subprojects list to all the things that are built as subprojects, it adds this magic find_package override, and then it calls add_subdirectory for everything that is built as a subproject. So we never get a deep hierarchy; we always have this flat tree, which is one level deep. Does that answer it? So I explained how this works, and it does not matter whether foo itself provides a package config, or CMake or our own project provides a find module — it works in either case. If it is a system package, then find_package will either use the FooConfig or the FindFoo module; when we build it as a subproject, find_package simply does nothing, and the target Foo::foo is part of the project anyway. So it works in all three cases. Question — where would FindFoo.cmake come from? If foo does not distribute a FooConfig.cmake, then the project that depends on foo needs to provide the FindFoo module, or it can come from CMake itself — for PNG, for example, it is available in CMake. So there are multiple places it can come from. I am just saying that when a library can be built standalone, it is also possible to use it through this concept of a package manager, and it does not require any changes. Yeah — the prefix path is a single variable that contains a list of strings, so it can be a list of directories. Question — what do I do with targets that do not follow this new approach of target-based dependencies et cetera? OK, the first step is to give this presentation, the next step is that you all fix your targets, and the third step is that we build this package manager.
In the worst case, we can wrap the find module and add the target there ourselves. OK — CTest. CTest is also quite straightforward. CTest allows scripting: we have, for example, a CMake file called build.cmake, and we can run it with `ctest -S build.cmake`. Here we have those commands: ctest_start, ctest_configure, ctest_build, ctest_test, ctest_coverage, ctest_memcheck, and ctest_submit. The important takeaway is that this is the place where you should configure your CI builds. If you require special compile flags — for example, here we want to build with coverage information so that we can later run gcov; or say you want to run valgrind, which also wants particular compile flags; or you want to use the thread sanitizer or the undefined behavior sanitizer, whatever — all this information about how it is built on the CI machine should be outside of your project. I have seen projects that, for example, find valgrind and then register additional tests, next to the usual tests, that run the same tests again under valgrind. That is absolutely not necessary. This information should be kept outside of the project, in the CTest scripts, because CTest knows how to run coverage, how to run memcheck, and even how to parse the output of those tools, so it can send this information to CDash. Question — is this file on the slide build.cmake or another file? Don't confuse it with the entry points of a project, which are CTestConfig.cmake and CMakeLists.txt; this is just a file that I usually put directly on the CI machine. It is not committed into the project, because it is completely project agnostic: I use the same build definition for a wide range of projects. So, like I said, CTest scripts are the right place for CI-specific settings; keep that information out of the project — it just makes things simpler.
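A hedged sketch of such a CI script — the paths, the generator, and the coverage flag are assumptions, and submission presumes a CDash server configured via CTestConfig.cmake:

```cmake
# build.cmake — run with:  ctest -S build.cmake
set(CTEST_SOURCE_DIRECTORY "$ENV{HOME}/src/foo")
set(CTEST_BINARY_DIRECTORY "$ENV{HOME}/build/foo")
set(CTEST_CMAKE_GENERATOR "Ninja")
set(CTEST_MEMORYCHECK_COMMAND "/usr/bin/valgrind")

ctest_start(Continuous)
ctest_configure(OPTIONS "-DCMAKE_CXX_FLAGS=--coverage")  # CI-only flags live here
ctest_build()
ctest_test()
ctest_coverage()   # collects and parses the gcov output
ctest_memcheck()   # reruns the tests under valgrind
ctest_submit()     # pushes everything to CDash
```

Nothing in the project itself needs to know about coverage or valgrind; the same project builds unchanged on a developer machine.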
A discussion that we had during the week: if I build a project as a subproject, how do I make sure that when I run the tests, I run just my tests and not everything? We can do this by following a naming convention. Whenever you add a test, follow a naming convention — prefix it with the project name, for example. Then, when you want to run only your own tests, you can simply say `ctest -R` with a regular expression that matches those tests, run them in parallel, and get verbose output in case a test fails. The same settings can also be set directly in the CTest script. So this is a guideline: follow a naming convention for tests.

Sometimes you want a test that fails to compile something. We can do it like this: we add a library that consists of just a single source file, and we register a test whose command runs `cmake --build` in the current build directory with our target — so it will try to build this target, which is normally excluded from "all". When you run make it will not be built; only when you run this test will CMake actually try to build it. Then we can set a test property saying this command should fail. But that may be problematic, because it may fail for the wrong reason — maybe it fails to compile because it does not find the correct header file, while what you actually want is for it to fail because some static assertion fires. Therefore it is better to set the property that the test only passes if a regular expression matches, and there we put the output of the static assertion directly.
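A sketch of such a fail-to-compile test — the target name and the assertion message are placeholders:

```cmake
# A target that is expected NOT to compile; excluded from "all".
add_library(foo_no_copy EXCLUDE_FROM_ALL test_no_copy.cpp)

# The test builds just that target on demand.
add_test(NAME foo.no_copy
  COMMAND ${CMAKE_COMMAND} --build . --target foo_no_copy
  WORKING_DIRECTORY ${CMAKE_BINARY_DIR})

# Pass only when the expected static_assert fires —
# not on arbitrary compilation failures such as a missing header.
set_tests_properties(foo.no_copy PROPERTIES
  PASS_REGULAR_EXPRESSION "foo must not be copied")
```

The test name carries the `foo.` prefix from the naming-convention guideline, so `ctest -R '^foo\.'` selects it together with the project's other tests.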
Then there is this interesting thing when you cross-compile: there is a variable called CMAKE_CROSSCOMPILING_EMULATOR. You can set it to an emulator, and it will be prefixed to the command line of a test if the first item on the test's command line is a target known to CMake — CMake knows this is something it is currently building, so it prefixes it with the cross-compiling emulator. This allows us to run unit tests in Wine when cross-compiling for Windows, for example, or in an emulator when cross-compiling for ARM. Something I did previously: I simply set it to a script — the cross-compiling emulator was a shell script that takes the binary, copies it to an embedded platform, executes it there, and reports the results.

More on cross-compiling: cross-compiling is done in CMake using toolchain files, and this is about the extent of what you should have in a toolchain file; everything beyond that is too complicated. We set the target system, we set the compilers, we say where to find the dependencies, and maybe we also set a cross-compiling emulator. Try to Google for a cross-compiling toolchain file for Android: it is thousands of lines or so. But this is what I think the extent of a toolchain file should be. Guideline: don't put logic into toolchain files; really, use a single toolchain file per target platform that you want to support.

So, this is my favorite thing: static analysis. This is something I recently contributed to CMake, but it starts a little more generic. Question — what is my opinion about treating warnings as errors? It's a good thing; everybody agrees. But how do you do it? What mechanism do you use to treat warnings as errors? You use the compile flag -Werror, for example. OK — but if you do that, do you really treat warnings as errors? Let's ask a different question first: how do you treat errors? No — how do you fix them? If the build breaks, what do you do?
Basically, you do this: when you have errors, you fix them, or you reject pull requests, or you hold up releases if the current branch doesn't build. That is how you treat errors. So what is the correct answer for how to treat warnings as errors? I would say it's the same — and I would also say I would never pass -Werror to the compiler. Because if you do, the compiler turns warnings into errors, so you can no longer treat warnings as errors: you will not get any warnings anymore, you only get those errors. Adding -Werror also causes all kinds of pain: you cannot enable it unless you have already reached zero warnings; you cannot increase the warning level unless you have already fixed all warnings introduced by that level; you cannot upgrade your compiler unless you have already fixed all the warnings the new compiler reports; you cannot update dependencies unless you have already ported your code away from the symbols that are now marked as deprecated; and you cannot even mark your own internal code as deprecated while it is still used — and if it is no longer used, why mark it as deprecated? You just remove it.

So I would not use -Werror. Instead, I think this is a much better approach: treat new warnings as errors. Follow this process: at the beginning of a development cycle — of a sprint, for example — allow new warnings to be introduced. You can increase the warning level, you can explicitly enable new warnings, you can update the compiler, you can update dependencies, and you can mark your own symbols as deprecated. Once you have done this, you say: OK, now we have introduced new warnings; now we enter a stage where we just analyze the delta. When someone removes a warning: great. When the warning count stays the same: OK. When it increases: forbidden. So we burn down the number of warnings until we reach zero, and then we can repeat the whole process. Yep — the question is whether I have a good CI tool for tracking that.
I think it can be done with CTest, but I'm not sure, so it is not solved completely yet. But I think this is the process we should use to allow fixing warnings, introducing warnings, and so on. And once we have that, we can think about new ways of introducing warnings, because the compiler is not the only tool. For example clang-tidy — anybody heard of it? Good. cpplint is a code checker from Google. include-what-you-use is a tool that analyzes the includes and can report which includes are necessary and unnecessary. Clazy is something that comes out of the KDE community; it is a compiler wrapper that finds Qt-related anti-patterns.

Now back to CMake, because we have target properties for static analysis. There are the target properties <LANG>_CLANG_TIDY, <LANG>_CPPLINT, and <LANG>_INCLUDE_WHAT_YOU_USE, and there is also LINK_WHAT_YOU_USE. What the first three properties do is run the static analysis tool right before the compiler. The interesting thing is that all the warnings from those tools appear directly in your compiler output — so you get IDE support out of the box, and the diagnostics are also visible in CDash when you use CTest to submit the test results. <LANG> here is a placeholder; it can be either C or CXX. And all those properties are initialized from variables prefixed with CMAKE_.
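A sketch of enabling these properties on a single target — the specific checks and tool invocations are assumptions:

```cmake
add_executable(app main.cpp)

# Each tool runs right before the compiler; its diagnostics show up
# in the regular build output.
set_target_properties(app PROPERTIES
  CXX_CLANG_TIDY "clang-tidy;-checks=-*,readability-*"
  CXX_CPPLINT "cpplint"
  CXX_INCLUDE_WHAT_YOU_USE "include-what-you-use")
```

Setting the corresponding `CMAKE_CXX_CLANG_TIDY` (etc.) variables instead initializes the property for every target created afterwards.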
But there is a limitation: most of those tools report diagnostics for the current source file plus its associated header file. So when you have header files that do not have an associated source file, you will not see their warnings. Many of those tools let you set a custom header filter, which defaults to the associated source file's name, and you can relax that — but then you get the warnings for the same header multiple times. Much better, I think, is this guideline, which I have seen in a talk by John Lakos: for each header file, there is an associated source file that includes that header file at the top, even if the source file would otherwise be empty. The comment from the audience is that there may be a compiler warning about that — but at least those tools are satisfied, so it would probably be a good idea to have that, and then explicitly disable the compiler warning that complains about it.

Question — they are extensible, and the question is how extensible. All those target properties can at least be set on individual targets, and you can organize your code in such a way that you have different warnings for different targets. Question — can you add your own, for example your own check, your own properties for your own static analysis tool? The answer is no; those are hard-coded in CMake. The comment from Chandler is: when you use -Werror from the beginning, then you basically always have zero warnings. I disagree with that, because I do not always start from scratch. Most of the time I come into an existing code base that has lots of warnings, and I also have those cases where I want to upgrade the compiler, which introduces new warnings; this is something I simply cannot ignore. OK — I think we can have that discussion afterwards. Peter — if you want to hook in, for example, FlexeLint, it will not work; that's correct. You can maybe abuse those properties if the tool accepts the same command-line interface, but really, they are hard-coded in CMake. So, like I said, this guideline from John Lakos makes sense, I think. I had one code base where I really had this problem of how to add those source files, so I came up with a simple bash script; it is kind of long.
I won't explain how it works here, but it really solved the problem. OK — and this is an example of how to control all this from the outside. I can take any CMake project and configure it like this: I set the C compiler to clang and the C++ compiler to Clazy, and from the outside I set clang-tidy and also include-what-you-use. It is absolutely not necessary to put any of that into the project definition itself; I can control it from outside for use in my project. And this is supported out of the box by all the IDEs, because it appears directly in the build output; and if the IDE supports fix-it hints from clang, it will simply assume that the fix-it hints it receives from clang-tidy have the same origin. [Applause]

OK, I have some slides afterwards; if you're interested, I will also cover pkg-config. Maybe we have additional questions first. Question — on one of the first slides I mentioned that it is preferable to use target_compile_features over explicitly setting compiler flags; now what happens if I have two components, one uses a C++11 feature and the other doesn't — will I not get possible linker errors if CMake chose different compiler flags? Very good question. So the question is: if I have two different libraries, one with compile features that require C++11 and another that requires C++14, then CMake will figure it out — that's the whole point. If I set the compiler flags explicitly, they will probably collide; but if I just tell CMake what my requirements are, CMake will figure out the compiler flags and make sure they do not clash. Yes, it also works with PRIVATE. — I have not used Conan; I am aware of it, but I have not really evaluated whether it solves those requirements that I have. OK, no more questions? Then I can give you some additional information.
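The outside-in configuration described above could be captured, for example, in a cache preload script — a sketch; the tool names and check lists are assumptions:

```cmake
# analysis.cmake — apply to any project with:  cmake -C analysis.cmake <source-dir>
set(CMAKE_C_COMPILER "clang" CACHE STRING "")
set(CMAKE_CXX_COMPILER "clazy" CACHE STRING "")
set(CMAKE_CXX_CLANG_TIDY "clang-tidy;-checks=-*,readability-*" CACHE STRING "")
set(CMAKE_CXX_INCLUDE_WHAT_YOU_USE "include-what-you-use" CACHE STRING "")
```

The project's own CMakeLists.txt stays completely untouched; the same effect can be had with plain `-D` arguments on the configure command line.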
So, this is my personal wish list for what the future of CMake might look like, and for all of the following ideas I have already started a prototype; if you want to contribute, you are welcome — talk to me. Disclaimer: there is absolutely no guarantee that any of this will be added to CMake at all. What I would like to have is precompiled headers as a usage requirement. Imagine this: I have a command called target_precompile_headers, and I give it a list of headers — public headers, private headers, interface headers — so CMake has all the information about which headers need to be precompiled for each target. Here is how this could work internally: CMake calculates a list of headers per configuration and per language from the build specification of each target, and then generates a file that simply includes all those headers. This generated header file is then used in the build system: CMake tells the build system to precompile this header and also to force-include it in all compilation units. And since it is force-included, it is absolutely not required to have something like `#include "stdafx.h"` in the sources. It will just work: if the build system supports precompiled headers, it will be somewhat faster; if it doesn't, it may be a little slower — but it does not affect the project in any other way, and you do not have to change the source code. The reason I think this would be nice: imagine you create an interface library for, say, Boost.Asio, and you declare boost/asio.hpp as its interface precompiled header; then any project that uses that library will precompile the header out of the box, without adding any configuration. Everybody agrees this would be nice? OK.
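A sketch of how the wished-for command might be used — a hypothetical API at the time of the talk, and the target name is a placeholder:

```cmake
# Hypothetical usage of the proposed target_precompile_headers command.
add_library(asio INTERFACE)
target_link_libraries(asio INTERFACE Boost::boost)

# Wish: everything that links against asio automatically precompiles
# (and force-includes) this header, with no per-consumer configuration.
target_precompile_headers(asio INTERFACE boost/asio.hpp)
```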
One more: languages. The CMake core is actually language agnostic. When you request C++ support — which is the default, but you can turn it off — when you enable a language, CMake looks into the module path and parses the definitions it finds there that say how to treat that language. So CMake can be used with D simply by putting the necessary files into the CMake module path. But there is a limitation: this really only works for the C and C++ model, where a source file is translated to an object file and the object files are linked together. Imagine we tweak that so that the output of compiling a source file may be another source file, which is then compiled to an object file and linked. Just having that would allow, for example, treating protobuf as a language, or Qt resources, or any other IDL; but it would also allow using CMake for all the languages listed here — I got the list from Wikipedia; those are all the languages that compile to C or C++. Questions? There is already Java support in CMake, but not compiling Java to C++; using this approach we could compile Java to C++ and then actually build the C++. I just took the list from Wikipedia of all the languages that support compiling to C; I don't even know all of them, so I can give no guarantee that this would actually work. Yes — the question is whether I would change the Boost support in CMake. I would like to drop the Boost support in CMake, but instead add CMake support to Boost. What should change? If Boost used CMake, that would make things easy, because then we could simply use CMake to export all these definitions. If Boost does not use CMake, then we should at least teach Boost.Build how to generate CMake package files. To answer your question about pkg-config: find_package currently has two different modes.
It has a module mode and a config mode: it first tries to find a find module, and failing that, it looks for a CMake package configuration file. I would like to add an additional mode that parses pkg-config files and simply generates imported targets from them. I looked at the implementation of pkg-config, and it looks complicated for three different reasons — there are basically three different things it does. It has containers, because it is written in C, so it needs its own set, its own string, functions for string manipulation, a hash set, et cetera; when you do it in C++, you don't need any of that. The second thing is transitively calculating all the dependencies — but CMake already does that. So the only thing that is actually necessary is parsing those files: if you just parse them and define imported targets from that, CMake's usual propagation will simply work.

The next thing I would be interested in: currently the CMake language processor has its own implementation, and there was an approach — it can be found in the wiki — to replace it with Lua. But the way it was implemented, it just added a Lua front end: whenever you called a function in Lua, it translated that into a CMake command and executed that command on the CMake processor. I would do it the other way around: I would like to replace the CMake language processor with the Lua virtual machine and build the current language as a front end to Lua. We would still have the current find modules — they would still work — but it would also become possible to write CMake modules in Lua. And the last thing is the front end: instead of the CMakeLists.txt file, I would like to have a declarative language that allows procedural subroutines. libucl would be an option.
It has previously been shown that libucl can be extended with Lua. This is what I wish for — and then, of course, you can tell me whether you have additional ideas. [Applause]
Info
Channel: CppNow
Views: 166,274
Keywords: Daniel Pfeifer, C++Now 2017, Computer Science (Field), + C (Programming Language), Bash Films, conference video recording services, conference recording services, nationwide conference recording services, conference videography services, conference video recording, conference filming services, conference services, conference recording, conference live streaming, event videographers, capture presentation slides, record presentation slides, event video recording, video services
Id: bsXLMQ6WgIk
Length: 87min 2sec (5222 seconds)
Published: Tue Jun 06 2017