Esri State & Local Connect | The GIS Professional's Guide to Spatial Analytics

Captions
Hello everyone, and welcome to the fifth installment of Esri State and Local Connect. This webinar series is brought to you by Esri's state and local government team and was created for GIS professionals, to help bring you the information you need to raise the bar within your organizations and transform the communities in which you live and work. My name is Dan Higg, and I'm currently working out of the Esri office in St. Louis. Joining me today is Rob Hathcock. "Hey Dan, thank you. I'm an account manager with the state and local government team." We've also got Piali Kundu. "Hi everyone, I'm Piali Kundu. I'm a solution engineer based out of Esri's New York City office, supporting the New York City metro region, and I also lead Esri's racial equity solution engineering team." Great. And we've got Nicole Grams. "Good morning, good afternoon. I'm also a solution engineer, based out of the St. Louis office, and I work with state governments." Great, thanks Nicole. And Yuri Patowski. "Hey everyone, thanks for joining us today. My name's Yuri Patowski. I'm a solution engineer based out of the Charlotte, North Carolina office, and I help the communities of North Carolina and South Carolina." Thanks, Yuri. And last but certainly not least, we've got Tim Loftus. "Hello, I'm Tim Loftus. I'm a solution engineer out of our Philadelphia office, and I support state governments in New Jersey, Pennsylvania, and Delaware." Thanks, Tim.

So we're really glad you could join us today for this webinar, The GIS Professional's Guide to Spatial Analytics. During today's webinar we're going to be talking about ready-to-use ArcGIS capabilities for spatial analytics: capabilities that can help you uncover hidden patterns, make predictions, and understand changes over time, so you can improve government services and take meaningful action through data-driven decision making. Just a reminder, we are recording today's webinar, and all attendees will receive an email with a link to the recording if you want to share it or just re-watch it. We also welcome any questions you have, and we'll have some time at the end of the session to answer a few of them. If you'd like to submit a question, you can do so at any time during the webinar using the GoToWebinar questions pane.

All right, let's get started. For many of us, spatial analysis is the most intriguing and remarkable aspect of GIS, and ArcGIS provides a comprehensive collection of more than 2,000 spatial analysis tools that extend your ability to answer complex spatial questions. During the webinar we're going to use a series of brief demonstrations to highlight capabilities and use cases for data engineering, visualization and exploration, spatial analysis, machine learning and artificial intelligence, and modeling and scripting, and do this in the context of addressing some challenges in our communities. We're not going to be focusing on big data analytics today, but we will look at that topic in depth in a future State and Local Connect webinar.

Many of you are using these capabilities now, and there are numerous examples of great data-driven decision making we could explore in state and local government, but I wanted to highlight just a few. Firstly, there's the Unified Government of Wyandotte County, which did block-by-block analysis of total revenues versus infrastructure costs to make better decisions around public infrastructure. There's the city of Austin, Texas, which discovered possible planting space by using geoprocessing to strip away areas not suitable for tree planting; this allowed tree-planting groups to look beyond standard measures of existing canopy and envision what's possible. And then there's Boulder County, Colorado, which used spatial pattern mining to visualize where COVID-19-related symptoms were indicated in 911 calls, helping first responders do their work more safely. That's just a few examples of the great work being done by some of our state and local government customers.

Now I'm going to turn it over to my colleague Rob, who is going to step you through a solution that leverages spatial analytics to address a very timely problem. Rob?

Thanks, Dan. Like many states, here in Georgia we are continually looking for innovative methods to evaluate COVID-19 planning scenarios, and this is just one example of how a vaccine distribution plan can be visualized. I'd like to add a bit more context before we get started. In mid-September, the CDC released an interim playbook for how jurisdictions can optimally distribute the COVID-19 vaccine. The playbook contains a host of guidelines for training and planning, but a major portion is dedicated to breaking the population down into three phases, or groups. And because we need to serve the Georgia population regionally, what was traditional analysis now leans heavily on spatial analysis. Phase one is intended to provide the vaccine to those who need it most: healthcare workers who are directly or indirectly interacting with patients and, in a slightly delayed sub-phase, the critical workforce, everyone from food packaging workers to teachers to child care providers. This phase also includes everyone over the age of 65. Phase two continues the focus on those at a heightened risk for severe COVID-19: those with asthma or diabetes, those in group quarters like university students, and those without insurance, just to name a few. Finally, phase three is the remainder of the general population, everyone who's at a lower risk for severe COVID-19.
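The three-phase split can be sketched as simple arithmetic. The sketch below mirrors the estimates Rob describes in a moment (a 25 percent critical-workforce share and 75 percent phase-one uptake); every input count and both default shares are illustrative assumptions, not official CDC figures, so substitute your jurisdiction's real counts where you have them.

```python
# Toy sketch of the playbook-style phase segmentation (all numbers illustrative).
def segment_phases(total_pop, healthcare, ltc_residents, workforce, seniors,
                   high_risk, critical_share=0.25, uptake=0.75):
    """Return estimated (phase 1, phase 2, phase 3) population counts."""
    phase_1a = healthcare + ltc_residents          # patient-facing staff + long-term care
    # Assume a share of the workforce is "critical"; subtract healthcare workers
    # already counted, then add everyone over 65.
    phase_1b = critical_share * workforce - healthcare + seniors
    eligible_1 = phase_1a + phase_1b
    vaccinated_1 = uptake * eligible_1             # actually vaccinated during phase 1
    rollover = eligible_1 - vaccinated_1           # eligible earlier, vaccinated later
    phase_2 = high_risk + rollover                 # heightened-risk groups + rollover
    phase_3 = total_pop - vaccinated_1 - phase_2   # everyone else
    return round(vaccinated_1), round(phase_2), round(phase_3)
```

Because phase three is defined by subtraction, the three estimates always sum back to the total population, which is a handy sanity check on the inputs.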
From a technical perspective, getting to this point was not a massive undertaking. Remember, it's the spatial analysis that laid the foundation, not the resulting technology. In this tab, we've outlined how we interpreted the playbook to get these values for each population phase using data in the ArcGIS ecosystem; we'll return to this topic with our next presenter. In this application we're looking at the potential distribution network for phase one: those residents who are part of the critical workforce and those at the highest risk for severe COVID-19. In this sample region that's approximately four million individuals, so for Georgia, phase one is going to be a fairly large network. We've used the location-allocation tools in ArcGIS to select the top 40 hospitals that can serve as distribution sites, seen here. Then, for each census tract, we've identified the nearest distribution site within 60 minutes of driving. This returns a few outputs. Number one, we can estimate the population that might be served by any one of these sites, illustrated by the size of the circle; we can even get a breakdown of the demographics for each site. Number two, we can identify those census tracts that cannot access a distribution site within 60 minutes of driving, here highlighted in red. In this sample region we notice that a large majority of those who cannot reach the distribution sites are on the edges of the state lines, which might highlight a need for data sharing and interstate collaboration. For this demonstration I'm showing some of our estimations; if your local jurisdiction has specific values for the number of individuals in the designated workforce categories, or for healthcare professionals, you're encouraged to use that data to supplement this analysis.

In our About the Analysis tab, you'll see that our phase 1 population starts with an estimate of the number of healthcare workers who directly interface with patients, plus long-term care residents; this gives us our phase 1a population. For phase 1b, we use a variable from the Esri enrichment tool related to the total workforce. While the CDC and CISA offer a specific list of who is critical, we made a general estimate that 25 percent of the workforce might be deemed critical, and we subtracted the healthcare workers that were accounted for earlier. We used another variable from the data browser to add in the total senior population. Finally, to simulate rollover (those who were eligible for phase one but actually ended up getting vaccinated later), we multiplied this phase by 75 percent and rolled part of the population into phase two. In phase two, we use market analysis variables from the data browser, including the percentage of the population using insulin, weight-loss drugs, and inhalers, and we've additionally added in those in group quarters and those without insurance; then we added in the remaining 25 percent from phase one. Finally, in phase three, we subtracted phases one and two from the total population, which gives us everyone else. Again, we hope to demonstrate that ArcGIS is a great platform for this type of segmentation and the necessary analysis, and that there's a whole suite of data products available to help you make these determinations. Now I'm going to give the floor to Piali, an Esri solution engineer who is going to take you on a deeper dive into spatial analysis.

Thanks, Rob. The first step in starting any analysis is compiling and cleaning your input data sets. ArcGIS provides access to many data engineering tools and capabilities, including on-demand updated demographic and business layers, analytics and data enrichment tools, and hosted data science compute that can clean up and greatly enhance your existing enterprise data. Let's explore some newer tools and capabilities in the ArcGIS Online cloud, taking a closer look at the pandemic and the availability of outdoor recreation and activities for community health and mental well-being. Prolonged quarantine and limited indoor activity options renewed attention to broader systemic issues of unequal distribution of good-quality parks and outdoor recreational services. Not only did COVID-19 disproportionately affect Black and brown communities, but poorer, predominantly BIPOC neighborhoods had fewer outdoor facilities and services for socially distanced activities during quarantine. The Gotham Center for New York City History noted that, in general, poorer neighborhoods often have fewer trees and higher rates of crime and health problems like asthma. As an analyst, I want to understand and identify where some of these areas are, based on environmental, demographic, and socioeconomic factors, so they can be targeted and prioritized for new programs and opportunities.

We can start by building a comprehensive data set combining different data points based on relevant criteria, like BIPOC population count, income, and presence of existing park infrastructure. Here I have a map of New York City census tracts, parks, and street trees. Most GIS professionals are familiar with ArcGIS tabular and spatial joining tools, but did you know you can also join data from remote sources using field calculations in Arcade? Arcade is a portable, lightweight, and secure expression language that works across ArcGIS tools and environments, including ArcGIS Pro, Online, Enterprise, Runtime, and the JavaScript API. Arcade allows you to perform complex math, manipulate text, and establish logical conditions within a layer's visualization, labeling, pop-ups, and now field calculations. Let's start by calculating street tree count per tract using a simple intersect formula: I can create a FeatureSet object to query the street trees, then intersect it with each tract to get street trees per tract. Arcade also works with line and polygon intersects, so I can apply a similar technique to calculate the area of open park space per tract. Here we can see that the trees were added; I'm creating a new field, and then I'm going to use Arcade to calculate the area of open park space that intersects with the tract.

Next, I'd like to understand the social makeup of the tract, like predominant race and income metrics. Esri's racial equity and social justice hub provides a wealth of ready-to-use data sets, maps, apps, user stories, and use cases that can be leveraged directly in your analyses. Here I've found a map visualizing the race and ethnicity associated with the lowest median income per tract, and I'd like to join it to my data. As I zoom into the map, I can see that this data exists at various scales, including states and tracts. Esri's Living Atlas team already wrote an Arcade calculation utilizing ACS median income by race data, which I can use to incorporate the same calculated data in my tracts layer. I'm going to add another field, this time a string field, because I want it to hold the different race categories; then I'm going to calculate that field using a variation of the Living Atlas team's formula. I'm again using a FeatureSet to query the ACS income layer and a filter statement to match the ACS tract to my tract feature. We then create an array of the race demographic metrics, evaluate which race demographic has the lowest median income value, and return it.

Now let's switch to ArcGIS Online's hosted notebooks environment to complete a few more steps. ArcGIS Notebooks is a hosted Jupyter notebook environment optimized to work with your ArcGIS items and tools. Here I'm pulling in the census tracts layer we've been editing with just one click; you can also work with analysis tools and standalone files in the notebook. In this notebook I'm going to do a bit of data engineering. I'll start by importing my libraries, then import my census tract item and convert it to a spatially enabled DataFrame so I can view more information about it. I'm going to add several variables that I don't have tract-level data for using Esri's GeoEnrichment service. GeoEnrichment provides access to Esri's 2020 updated demographic data and uses data apportionment to append additional data, even to custom geometry. You can use Esri's data browser to search data sets, select variables, export them to JSON, and inject them directly into your scripts. Here I've added 2020 minority population and total population, and I'm searching for 2020 median income; it's important to note that all of these variables are updated to the current date. I'm just exporting those variables as a JSON array, which I can copy and paste into my script. Next, we run the enrichment tool, take its results, and use a DataFrame to print out a few rows so we get an idea of what the table looks like; we also print out a list of the fields. GeoEnrichment produces many valuable supplementary fields that are useful for evaluating the enrichment results; once we've completed our evaluation, we can easily remove them for end users like decision makers and the general public. We print out our results and make sure the table looks good before proceeding.

Finally, let's bring all these variables together using some more field calculations and some advanced symbolization techniques. I'm adding my enriched census tracts layer, with all of my enriched variables, and creating a new field called social priority. I want to create an index that brings together all of the different social data points I've appended: in this case, I'll evaluate whether a tract has a large minority population, whether the lowest-median-income population is BIPOC, and whether the median income is below average. I run the calculation and get back that index value. I'm going to do the same for my environmental variables with an environmental priority field.
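The logic of the social index just described (and of the environmental one that follows) boils down to counting how many criteria a tract meets. Here is a plain-Python re-creation of that Arcade-style index; the field names, the BIPOC category list, and the thresholds are all hypothetical stand-ins, not the exact names used in the demo.

```python
# Plain-Python sketch of the priority-index field calculations.
# Field names, thresholds, and category list are hypothetical stand-ins.
BIPOC_GROUPS = {"Black", "Hispanic", "Asian", "American Indian", "Pacific Islander"}

def social_priority(tract, citywide_median_income, minority_threshold=0.5):
    """Count how many social-priority criteria a census tract meets (0-3)."""
    checks = [
        tract["minority_share"] >= minority_threshold,      # large minority population
        tract["lowest_income_race"] in BIPOC_GROUPS,        # lowest-income group is BIPOC
        tract["median_income"] < citywide_median_income,    # below-average income
    ]
    return sum(checks)

def environmental_priority(tract):
    """Count how many green-amenity criteria a tract meets (0-3)."""
    checks = [
        tract["parks_within_1000ft"] > 0,                   # park inside a 1,000-ft buffer
        tract["park_area_sqm"] > 0,                         # tract itself contains a park
        tract["tree_count"] >= tract["area_sqm"] / 1000.0,  # ~1 street tree per 1,000 sq m
    ]
    return sum(checks)
```

Summing booleans like this gives each tract a 0-3 score per theme, which is exactly what the relationship symbology then compares across the two themes.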
In this case, I'm going to evaluate, using Arcade expressions, whether there is a park within a 1,000-foot tract buffer, whether the tract itself has a park, and whether there are a reasonable number of street trees, about one street tree per thousand square meters. I could continue to add different types of variables, for example asthma hospitalization or mortality rates, or additional environmental programs, but for now I want to see the relationship between the environmental factors and the social factors in each tract. I'm selecting my two priority indexes along with minority population and choosing the relationship-and-size symbology technique. The relationship style lets me see two different variables and how they compare: where their high-highs and low-lows are. I can adjust the symbology fill and gradient, the grid size, and a lot of other factors, and here I'm also using a third variable, minority population, to help determine what areas would have the largest impact. After applying this visualization, we can see several regions in Brooklyn, Queens, and the Bronx start to stick out; many of these places appear to have below-average median income, high minority population, and low existing access to park programs. In this section we explored many different ways to engineer data in ArcGIS Online, and we saw how some newer tools like GeoEnrichment and Arcade field calculations can be useful for your data engineering efforts. We also saw how tools like smart symbology can provide some initial, robust visual analysis. Next, Nicole will show us some more advanced spatial analysis tools in ArcGIS Pro.

Piali just showed us a wealth of resources and important considerations that enable analysts to build a solid data foundation as we begin to examine a problem through the lens of spatial analysis. In this demo I'm going to keep the theme of park locations but travel a little further south, assuming the role of a GIS technician in the city of Charlotte, North Carolina. Our park system is managed by Mecklenburg County, so we need to work together to increase our growing population's satisfaction with recreation resources like greenways and parks. The GIS staff have been asked to help with an assessment of the state of parks in Charlotte, so I'm going to use ArcGIS Pro 2.7 to perform some descriptive analyses. First, I want to gauge the availability of parks by calculating the acreage of park space per 1,000 residents. In my map I have county-wide park boundaries as well as the Charlotte city limits. I'll start by selecting all of the parks within the city limits using a fundamental select-by-location operation; to make it easy, I'm just going to choose the parks that have their center in Charlotte, because some parks are really long and winding, like greenways, and I'll create a new selection. If I take a look at the attribute table, I see there's a field called Sum Acreage, which brings together all the acres for multi-part polygons or distinct parks, and I want to sum up all of the selected park acres. I can do this by creating a summary table, so I'll quickly put that field in there, and now I have an approximation of the total number of park acres within Charlotte: just under 10,000.
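The availability metric Nicole is building, acres of park space per 1,000 residents, is simple division, and wrapping it in a small function is what makes the workflow repeatable. The sample numbers in the usage note are illustrative, not the demo's exact inputs.

```python
# Park availability: acres of park space per 1,000 residents.
def acres_per_thousand(total_park_acres, population):
    """Return acres of park space per 1,000 residents, rounded to two decimals."""
    return round(total_park_acres / (population / 1000.0), 2)
```

For example, with roughly 9,920 summarized park acres and a population of about 875,000, `acres_per_thousand(9920, 875_000)` gives 11.34, in line with the figure reported for Charlotte; rerunning the same function with another city's totals is all the "automation" the comparison needs.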
now from there in theory i could just take the charlotte population divide by a thousand and divide again but what i really want to do is take this process and make it repeatable and use it again so i'm going to do that by using a notebook in arcgis pro now i am not a python expert but using notebooks in pro has helped me improve my skills over time with shortcuts that make learning to code fun and easy for example i can check my geo processing history and copy the python command for any tool that's run successfully and then paste that entire line of code right into my notebook without having to memorize any syntax starting in pro 2.7 markdown headings can also be nested and minimized which makes understanding my workflow easy to read so first i'll complete this workflow that i alluded to in my map by running one cell at a time and i really like having this option for when i'm debugging a process which is pretty inevitable now last but not least i'm going to calculate that final number of acres per 1000 charlotte residents and i get the answer that there are about 11.34 acres of park space now not only have i documented the analysis workflow that i'm developing but i've also automated it for future use so i'm going to use this very same process to look at the city of austin texas which is similar in size and population i've already clipped the relevant parks boundaries to the city limits so i have a little bit less work to do here again i'll find the total acres of park space in austin and round that number to two decimal places and see that there are 17.75 acres of park space per 1 000 austin residents now i have some metrics to report to my boss about park availability to take it a step further i might want to understand the accessibility of parks both in terms of those that can be reached via public transit or those that are embedded within neighborhoods as peole mentioned one of our favorite resources is the living atlas i can connect to it directly in pro and 
seek out readily available demographic data to supplement my investigation for example if i want to understand parks availability in lower versus higher income neighborhoods i can search for this popular demographics layer in my catalog and add it to my current map here i've represented block groups by the field medium household income and i can also create sub selections from this living atlas data to narrow down my area of focus the charting capabilities in pro help me understand my data in a fundamental way and give me the option to create interactive graphics let me show you what i mean i can create a histogram chart and all i have to do is specify the variable that i want to represent in my chart i can add descriptive statistics such as mean and standard deviation and even visualize the normal distribution now looking at my chart and map at the same time i can interactively select some of these bins that are at the tails of my distribution for example those that represent the highest or those that represent the lowest median household incomes from there i can create sub selections of this data again and have a better understanding of where my lower and upper income neighborhoods are visualizing these areas in conjunction with public transit stations that are currently operating and proposed as well as existing park boundaries helps me ask the right question to continue my descriptive analysis but let's fast forward and imagine that charlotte and mecklenburg county are ready to assess where they can expand the current park system yuri and kim are now going to show us how using ready-to-use deep learning models and the new suitability modeler tool can help charlotte's staff predict the most optimal and equitable locations for new parks nicole thank you so much for that introduction so today as the gis professional in my organization i'm going to talk a little bit about using deep learning to do some land cover classification assessment and so you might be able 
to apply some of these principles into your own organization for workflows you might already be doing so they're great for automation and deep learning is really good at getting at information in traditional data sets like imagery that you traditionally couldn't get at so with that in mind i just want to jump right in and so we're looking at arcgis pro here and what i've done is i've added this nape image so we've been talking a lot about the living atlas traditionally we've been talking about vector-based data sets right so i'm actually going to switch gears and talk about raster-based data set and this is publicly available through the living atlas and this is just a four-band image service and i'm looking at south mecklenburg county now and what i'm going to do is i'm actually going to train this model directly on this image service which is really awesome so there's no need to download any data i can just consume this as a service and run this analysis directly on it and so with that in mind i'm going to do the majority of the work in a arcgis notebook from within arcgis pro and so the first image there that i had is kind of the workflow that you're going to utilize when you're doing these deep learning processes right and so the first thing we're going to start with is labeling objects that's going to be the first step regardless of of what the workflow that you're you're doing at that point in time is but i'm sure everyone's familiar with labeling objects if you've done any kind of land cover classification before this is kind of that first step and so if we step back into the map view in arcgis pro there are kind of two ways to do this i could select that mape image i could go to imagery and do it through the image classification tools through the label objects for deep learning wizard but what i've done is i went ahead and created a feature class called labeled data this is just circular geometry and if we open the attribute table here we can see that this 
is just 17 geometries i've created and what's important about them is they have this class value field and that correlates to a value between one and four so the tree canopy that we're really interested in for the land cover that i'm going to pass along to tim for the suitability model is we're really interested in that class value four and that corresponds to tree cover and so my first step would be to create these these training data sets and then i'm going to pass them into a geoprocessing tool and so it's the export training data for deep learning processing tool and what i do there is i point to the nape image here i'm going to specify an output folder this in this case is going to be the spatial analytics folder that i created for this project and then i'm going to point to that label data as the input feature class and i'm going to point to that class value field which is important and i'm going to export these as a classified tile because that's the format that we need i'm not going to run this because it takes a little bit of time to process but i will step back into the catalog window and show you what the output is so i had specified the training folder on a prior run of that tool and what you get are these image chips which are in tiff format and these labels and this is just the labeled data corresponding to those those pixels on the ground that you're that you're interested in extracting or classifying so once we have that step done what we're going to do is we're going to step back into the notebook and i'm going to import the modules and use this prepared data function to get the data prepped and ready to be processed for them from the model and so i just want to caveat this process by saying that even if you're not really great at python or your comfort level doesn't really extend into the into programming this is a seven line script that anyone can reproduce and will actually provide the notebook after the presentation so what i do is i call this 
prepare data function and that prepares the data by creating an image bunch that i can pass into the model i run this we don't have an output for that and then we're going to step into verifying that the classes present in the training data was captured and if i call this this um function data.classes what i get is yeah i can see that each one of those four classes is present that that has been taken care of and this just data dot show batch shows those data sets or those training samples and we can see that it's actually grabbing those pixels and that looks good as well so the next step would be to actually train the model and then classify those pixels so we're going to be using a model called unit which is just an image segmentation model that's really good at these land cover classification processes and i prepare that here i train the data for unit.fit and that too just means i ran it through the model two times in theory that works well here but in a production environment you would never just run it through twice it's kind of a proof of concept to show the power of the model training we can see that between the two it was less than two minutes training time i save the model and then i can show my results here so once that runs the final step in the process would be to run a geoprocessing tool to classify those pixels because that's why we're here right we want to see what those values are so i point to the input raster which would be that nape image and i'll specify an output location for that for that raster data set and i'm going to point to that model definition so that model definition is created in one of the previous steps runs in the model and what's really awesome about that is it creates a deep learning package that i can share with my colleagues or if i want to revisit this process in the future i can reuse that once that tool runs what i get are two output classified rasters i'm actually going to turn those on and i'm going to zoom to those areas 
to show you how accurate this model was i was actually pretty amazed at how accurate the model was at predicting those those pixels of interest so we can see that this model does a really awesome job of grabbing those those low land grasses um there's a hedgerow there that it captured and even these mid and high vegetation layers we're getting really good classification for each one of those so it does a really awesome job and if i zoom to another location to kind of show that as well we can see that even in this area that strip along this this side road where the street trees are they grabbed that that stretch of grass just along the sidewalk it's pretty incredible for you know less than two minutes of training a model so once i post process that what i would do is i would share this as an image service which i've done into a federated enterprise environment and then i'm going to pass that service over to tim for using the suitability modeler thanks siri as a data scientist i often leverage the power of spatial analytics to help decision makers understand where opportunities or gaps exist one of my favorite models a suitability model is a workflow that aids in locating the best place to cite something or identify what areas to preserve let's say i'm a data scientist at the city of charlotte and i'm working on a project to identify critical gaps in park equity as we've just learned from nicole there's grounds for more investigating now the question is where are the lease equitable parks my analysis will inform park director decision making around funding staffing and operations we could use a suitability modeler a new capability within arcgis pro 2.7 to analyze several different data layers that will help us understand where the least equitable parts are located these layers include but are not limited to distances from transportation stops crime densities poverty rates noise pollution even pull in data from yuri's ml process that helps display in tree canopy first 
let's get our project set up a little bit better adjust the windows and change my map extent too as well so we can see the map a little bit better there here's a look at poverty rates in charlotte the map shows the most suitable areas in dark green and the least suitable areas in red on the lower left the transformation pane shows us a histogram of the overall suitability model in the center is a transformation window where we can take the input layer and make adjustments to help with the classification so let's apply a linear function and invert it to better fit our needs there you can see the results on the right which now correctly indicates higher poverty rates will result in lower suitability scores moving on let's add a different map here and two maps so that way we can compare the results as we go with other layers bring that up again on the right we're looking at the overall suitability model and on the left we're looking at the individual layer which i'm going to now select on noise distance represents noise pollution in charlotte we're going to apply a different function here as well by a small adjust the midpoint and invert it now we can see the results on the right as a data scientist actively observing these changes is time saving i don't have to go down the path of running a model waiting for the results then interpreting them i can view the results as i go enabling me to adjust my model instantaneously now on the top right we can explore how these input layers are combined together using a weighted overlay these layers are all equally weighted at the moment but we can change these weights now poverty rates distances from those transportation stops and crime densities are all critical in assessing park equity so let's increase those accordingly so that way our resultant overall suitability model reflects the influence of those layers even more i will note that this tool also enables us to locate specific regions based on area units number of regions 
Other such criteria include region shape and an evaluation method like lowest value. However, for our example we already know where the parks are located, and after running the model we now know their suitability scores, which look like this. Again, areas in red are our least equitable areas and areas in green are our most equitable areas. We have one more step to take here, which is to extract these values to the park polygons that you see highlighted in green: each of these green polygons represents a park, and we really need those parks to have their resultant raster value. So I have a Python script that'll do just that, and if I run it, you'll see a new layer gets added to the map called suitability scores. If I select one of these parks, you'll see that a new column called raster value has been added; in this example, this park has a raster value of 134, which represents a pretty high suitability score. Now that all of our parks have those scores added to their attribute table, we're good to share this as a web layer back to our ArcGIS Online organization, so that we can begin the prescriptive analysis process by visualizing the results and placing them into context within an Insights workbook like this one. By displaying the analysis in a workbook like this, staff and decision makers alike can bring analytics and data into their decision-making process. Now, I've gone ahead and enriched this park layer with demographic information from our Living Atlas, which you can see on the right, to get an understanding of who our parks are serving and which parks have more social vulnerability. I've also added a couple of maps here on the left which are completely interactive: the top left map shows all the scores (the larger the circle, the higher the suitability score), and in the bottom left map I've filtered the data to show just the 10 lowest scores and added a distance buffer to help visualize where the least equitable parks are located and their service areas. There's also this table, which I've sorted in ascending order by suitability score.
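Tim's extraction script runs inside ArcGIS Pro (presumably using arcpy's zonal or extraction tools, though the transcript doesn't show the code). A stand-alone toy version of the same "attach raster values to park polygons" idea, assuming the parks have already been rasterized into a zone grid where each cell carries the ID of the park it falls in (0 = no park), might look like this:

```python
import numpy as np

# Toy suitability raster and a matching "park id" zone raster.
suitability = np.array([
    [120, 130, 140,  90],
    [110, 134, 138,  80],
    [ 60,  70, 150, 160],
])
park_ids = np.array([
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 2, 2],
])

def scores_by_park(values, zones):
    """Mean raster value for each non-zero zone (park) id."""
    return {int(pid): float(values[zones == pid].mean())
            for pid in np.unique(zones) if pid != 0}

scores = scores_by_park(suitability, park_ids)
# park 1 covers cells 130, 140, 134, 138 -> mean 135.5
# park 2 covers cells 150, 160 -> mean 155.0
print(scores)
```

In the real workflow these per-park values are written back into the feature layer's attribute table (the new "raster value" column in the demo) rather than returned as a dictionary.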
Again, the lower the score, the less equitable for our analysis. All of these processes that you see here are recorded in a preserved analysis workflow, so that the city can run this process year to year or share it with other stakeholders in an initiative around understanding the community's priorities. With this workbook, a decision maker can now have this supporting information to take action. As you can see, the Suitability Modeler has completely simplified the way we do suitability modeling, and we've leveraged this powerful tool in finding our least equitable parks in Charlotte, streamlining our ability to analyze and interpret several layers all at once. We've also shown how ArcGIS Insights can aid in data visualization, summarize our results, and place them into context, so we can share our findings and preserve our workflows for other key stakeholders and decision makers. Thank you. Now I'll pass it over to Robert. Thank you, Tim. As you can see here, Tim has completed his analysis using ArcGIS Insights. Because ArcGIS Insights is delivered in a SaaS model, Tim can share his work externally with the public, internally within his organization, and even with members of other organizations. As you may recall, during my previous demonstration we witnessed the very need, in our vaccination scenario, to collaborate. Collaboration just like this occurs daily within the ArcGIS ecosystem: we see this real-time sharing in local government, AEC firms, state agencies, academia, commercial use, and many more. To wrap things up, I'm turning this over to Dan. Thanks again, Rob, and thanks Piali, Nicole, Yuri, and Tim for stepping us through those demonstrations. The demonstrations you just saw really helped to illustrate what we refer to as the building blocks of spatial data science, beginning with Piali's overview of tools and capabilities for data engineering, visualization, and exploration, and her analysis of the equitable distribution of parks and green space.
Nicole focused on using tools for spatial analysis in ArcGIS Pro, looking at available green space in Charlotte, and then Yuri leveraged machine learning and artificial intelligence capabilities to do predictive analysis around park placement. Tim then showed us spatial analytics and modeling using the new Suitability Modeler to take another look at the equity of parks, and he shared those results using Insights to bring the analytics to light. We also saw the use of Arcade, Python, and ArcGIS Notebooks to support the analytic workflows. And if you remember, back at the beginning Rob showed a vaccine distribution solution that really brought all of these capabilities together to address a very timely problem. Now, we didn't touch on big data analytics during this webinar, but as I mentioned at the beginning, that is a topic that we're going to seek to cover during a future session. Hopefully we provided some useful guidance for all levels of GIS professionals listening today as you work to improve your skills around spatial analytics. We did want to leave you with a few helpful resources to get started investigating and learning more about this topic, with spatial data science, the Python API, and ArcGIS Notebooks, as well as the Living Atlas; and please note the spatial analysis Learn lessons and the Esri MOOCs for additional educational opportunities. And remember, this webinar is being recorded, so you don't need to scramble and write these down now. Just a reminder: we'd like you to join us for the next webinar in the State and Local Connect series on February 17th, when we'll be presenting "What Every GIS Professional Needs to Know about Artificial Intelligence and Machine Learning." Please remember to register for that webinar at go.esri.com forward slash state-local connect. And I did want to mention again that all of us on this call are part of a broader state and local government team here at Esri, a team that's committed to helping you achieve your organization's
goals. Although we may not be able to meet with you in person right now, we are very much here for you. I think we do have a little bit of time remaining for questions, so we'd like to try to answer a few of those right now. Nicole, I think you've been keeping an eye on some of the questions that were coming in? Yeah, thank you, everyone, for bearing with us through some of the audio issues early on. One of the questions I got that I think Piali would be able to answer well is: how do you get access to Esri demographics? Thanks, Nicole, that's a great question. There are a couple of different ways to get access to Esri demographics, depending on what products you use and how you want to actually access it. Esri demographics are available across the ArcGIS platform, and you can view all the products that incorporate Esri demographics directly on the Esri demographics documentation site. But one of the easiest and most flexible ways to access Esri demographics is through the geoenrichment tool, as I showed in my demonstration. It's also often referred to as Enrich Layer, and you can find GUI versions of that tool in ArcGIS Online, ArcGIS Enterprise, and ArcGIS Pro. The tool basically allows you to apply those chosen demographic data points to your own data, including point, line, and polygon data. You can also access geoenrichment programmatically through the ArcGIS Python API and ArcGIS Notebooks, which are available in Online, Enterprise, and Pro. And even without utilizing the tool, you can browse Esri demographics through the Esri data browser; you can get access to that as well through the Esri demographics product and documentation page. So you can basically just look through the categories to see what sort of variables there are, and in my demo I showed how you can also select those data points and export them to CSV or JSON for use in a script. Thank you so much. You actually touched on another question which was asked, namely: how do I get
access to notebooks? So, Piali, you said your options are in Pro, Online, or Enterprise, but Yuri, can you add any more commentary on notebook accessibility? Sure, yeah, great question. So, as was mentioned, last year notebooks became available across the platform. The demonstration I did was all in ArcGIS Pro, so that makes use of all the libraries, the arcpy module, and everything that you have installed locally on your client machine. But they're also available if you have an ArcGIS Online organization account; they're available to you currently with the standard runtime. And then, if you have an Enterprise environment, we have a server role, ArcGIS Notebook Server, which allows you to have access to those in the Enterprise environment. So you can choose whichever flavor you want and start building those models from there. Thanks, that adds a lot of clarity. Dan, can we take one or two more? Yeah, maybe just one more. Okay, this one's for Tim: what is the difference between the suitability widget in Web AppBuilder and GeoPlanner versus the Suitability Modeler in Pro 2.7? Great question. I would say, from my experience in working in both, the Suitability Modeler in Pro is a lot more flexible: there's more of an exploratory and interactive nature with the transformation pane, adding your own input layers, and adjusting the parameters. Users are also able to locate specific regions, unlike the widget in Web AppBuilder and GeoPlanner. In terms of the technology, too, you will need a Spatial Analyst extension license to access the Suitability Modeler in Pro 2.7, and if you want to customize your model, the widget in Web AppBuilder will leverage the weighted overlay service, which will need ArcGIS Server. So those are the differences with the technology requirements. Thanks, Tim, and Yuri and Piali for your answers. Hey, thanks for fielding those questions, Nicole. So if there were any questions we didn't get to live, we will do our best to
follow up afterwards. I just want to thank all of the presenters again today, and thanks to all of you who attended. We do hope to see you in person sometime soon. Have a great rest of your day. Thank you.
Info
Channel: Esri Industries
Views: 822
Rating: 5 out of 5
Keywords: Esri, ArcGIS, GIS, Geographic Information System, ArcGIS Pro, location, spatial analysis, spatial analytics, dashboard, analytics, analysis, pattern detection, object detection, predictive analysis
Id: IQO8hdbgjSc
Length: 52min 42sec (3162 seconds)
Published: Mon Jan 25 2021