SAS Tutorial | Getting Started with SAS Studio on SAS Viya

Video Statistics and Information

Captions
Hi everyone, my name is Luna Bozeman and I'm a technical trainer at SAS. In this tutorial we'll take a look at SAS Studio on SAS Viya and explore some of the great features that you can take advantage of. We'll first look at some of the programming features that are available to help make writing code a little easier, but you don't need to know how to write SAS code to use SAS Studio: you can use point-and-click steps and tasks that generate SAS code for you. We'll use steps to import a file and to query data, then we'll use a task to generate a map chart. All of the data preparation and analysis that we do will be added to a flow, which is a visual sequence of operations on data.

Before we start, I'd like to point out that I'll be using SAS Studio on SAS Viya 2020.1.3, but you should be able to follow along if you have SAS Viya 2020.1 or later. Keep in mind, though, that SAS Viya is updated continuously, so you can expect to see even more features and enhancements in future releases. I also have a license for SAS Studio Analyst, which provides additional features. You won't need this license to follow along, but I'll point out the additional features you may have access to if you do. Let me show you how you can quickly find this information. Once you sign in to SAS Studio, all you need to do is click the user button at the end of the application toolbar and then select About. If your site has a license for SAS Studio Analyst, you'll see that reflected in the product name, and you can find your SAS Viya version next to Version.

Now we're ready to get started. I'll be using data that's readily available in the Sashelp library, but I'll also use one starter program and a CSV file. If you'd like to follow along with me, you can download those materials using the link in the description below. Let's jump in.

Let's start by taking a look at some of the programming features that are available in SAS Studio on SAS Viya. I've signed in to SAS Studio, and notice that the main window of SAS Studio consists of a navigation pane on the left and a work area on the right. The navigation pane provides easy access to your open files, your folder shortcuts, file system, and SAS Content, as well as steps, tasks, snippets, the libraries you have access to, your Git repositories, and your file references. The sections that you see in the navigation pane depend on settings set by your administrator as well as what you've selected in the View menu. We'll explore these sections as we take a look at SAS Studio's features. The work area is used to display your data, code, tasks, logs, results, and flows; as you open these items, they're added to the work area as windows in a tabbed interface. When you first open SAS Studio, you'll notice that by default the Start Page tab is open so that you can quickly get started writing a new SAS program, building a flow, importing data, or creating a query.

Back in the navigation pane, let's start with the Explore section, which enables you to access files and folders from your folder shortcuts, your server file system, and your SAS Content server locations, if available. I have a SAS program named Earthquake Depth Categories, so I'll double-click it to open it in a new tab in the work area. The goal of this program is to categorize earthquakes in the Quakes table from the Sashelp library by the depth at which the earthquake occurred: if the earthquake occurred less than 70 kilometers from the surface of the earth, it's considered a shallow earthquake; less than 300 kilometers, an intermediate earthquake; and less than 700 kilometers, a deep earthquake. The magnitude values are also rounded to one decimal place, and the results are stored in the Earthquakes table in the Work library.
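For reference, here is a minimal sketch of what the starter program described above might look like; the exact column and dataset names (Depth_Cat, sashelp.quakes) are assumptions based on the narration.

   data work.earthquakes;
      set sashelp.quakes;
      length Depth_Cat $ 12;               /* compile-time statement; keeps category values from truncating */
      Magnitude = round(Magnitude, 0.1);   /* round the magnitude to one decimal place */
      if Depth < 70  then Depth_Cat = 'Shallow';
      if Depth < 300 then Depth_Cat = 'Intermediate';   /* no ELSE yet -- this is the logic error found below */
      if Depth < 700 then Depth_Cat = 'Deep';
   run;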
Let's see what the results look like. On the program toolbar I'll click the Run button, but you can also press the F3 key on your keyboard to run the program. Notice that the default tab layout is a vertical split, where the Code tab is on the left, and the Log and, if applicable, the Results and Output Data tabs are in a tab group on the right. To change this, at the end of the program toolbar select the More options button (three vertical dots), then Tab layout. I'll select Single to group the tabs into a single tab group, but feel free to choose whichever tab layout you'd like.

Now let's look at the Output Data tab to examine the output Earthquakes table. You'll notice that the new Depth_Cat column has a value of Deep for what seems like every earthquake, even those that occurred less than 70 kilometers from the surface of the earth. Let's check the Log tab to see if there is any information that could help us understand why the results aren't what we expected. The log lets us view messages returned from SAS. At the top is a log summary where we can quickly view any errors, warnings, or notes that were generated. In addition to the Errors and Warnings sections, I'll also select the Notes section so we can see all messages in the log summary. If you'd like to see the corresponding full message in the log, you can select the message in the log summary to jump to it. There are only three notes in the log, with no errors or warnings, and none of these messages seems to indicate a syntax error in the program. This is an example of what is called a logic error: an error that doesn't stop the program from running but produces unexpected results, like we saw.

A great tool that you can take advantage of in SAS Studio is the DATA step debugger, which enables us to step through the execution of a DATA step to locate where that logic error is coming from. Let's try it out. Going back to the Code tab, on the program toolbar I'll select Debug to enable the DATA step debugger. Now all sections of DATA step code in the program are highlighted with a green bar in the margin to indicate that they can be debugged. To start the debugger for this DATA step, simply click the bug icon in the margin next to the DATA step. In the DATA step debugger window, the code is on the left and the currently executing line is highlighted in purple; the list of columns and their current values are displayed on the right. I can use the DATA step debugger to see how the data is processed, statement by statement, row by row.

Let's start by executing just the highlighted SET statement. I'll click the step-execution button, which executes the next line, and the first row from the Quakes table is read in and its values are displayed on the right. Any changes in the column values are displayed in red, making it easy to see what's changed. You might have noticed that the LENGTH statement was skipped; that's because it's a compile-time-only statement, which sets the length of Depth_Cat to 12 to make sure the values don't get truncated. Notice that the current value of Magnitude is 2.75. Clicking the step-execution button again, the assignment statement rounds the magnitude value to one decimal place, overwriting Magnitude with a value of 2.8. The current value of Depth is 6.25, which is less than 70, meaning that the condition in the first IF-THEN statement is true.
I'll click the step-execution button twice to execute the statement, and notice that Depth_Cat now has a value of Shallow. Now the second IF-THEN statement is highlighted, and again, since the current value of Depth is 6.25, which is also less than 300, the condition is true; executing the statement, Depth_Cat is overwritten as Intermediate. Similarly, 6.25 is less than 700, so the condition in the last IF-THEN statement is true, and executing the statement, Depth_Cat ends with a value of Deep for this earthquake.

So why did this happen? When you have multiple IF-THEN statements, SAS tests all conditions in sequence for every row, and the last true condition executes the statement that determines the value in the output table. In this example, that means any earthquake with a depth value less than 700 is assigned a Depth_Cat value of Deep. Instead, the conditions were intended to be treated as a hierarchy, so that when a true condition is found, SAS executes the statement following the THEN keyword and skips the subsequent IF-THEN statements. To enforce this sort of sequential testing, the ELSE keyword can be used in front of each IF-THEN statement aside from the first one.

Let's make this change. I'll close the DATA step debugger window, and back in the program I'll add the ELSE keyword in front of the second and third IF-THEN statements. Now let's go back into the debugger to see the effect of this change. As I did before, I'll execute the SET statement to read in the first row, as well as the assignment statement to round the magnitude value to one decimal place. Again, the current Depth value is 6.25, which is less than 70; we've hit a true condition, so executing the statement assigns Depth_Cat a value of Shallow. The main difference now is that because a true condition was met and the ELSE keyword is used with the remaining conditional statements, those statements are skipped, leaving Depth_Cat as Shallow. You could continue to use the DATA step debugger to process all rows from the Quakes table, but since we've identified and fixed the problem, I'll close the debugger. On the program toolbar I'll select the Debug button again to turn off the debugger and remove the green bars in the margin, then rerun the program.

Taking a look at the Output Data tab again, the results look as expected. It may seem like every earthquake has a Depth_Cat value of Shallow now, but let's verify that. I'll right-click the Depth column and select Sort, then Descending, to sort the view of the table. Although none of the earthquakes fall under the Deep category, there are several earthquakes with depth values between 70 and 300 kilometers that have a Depth_Cat value of Intermediate. Keep in mind that sorting the table in the table viewer doesn't change the sort order of the table itself; it simply sorts the current view, and any customizations like sorting or filtering that you apply in the table viewer won't be saved with the table. To remove the sort, I'll right-click the Depth column again and select Sort, then Remove sort.
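After adding the ELSE keywords, the DATA step looks roughly like this (again a sketch, with names following the narration):

   data work.earthquakes;
      set sashelp.quakes;
      length Depth_Cat $ 12;
      Magnitude = round(Magnitude, 0.1);
      if Depth < 70 then Depth_Cat = 'Shallow';
      else if Depth < 300 then Depth_Cat = 'Intermediate';   /* skipped once a true condition is found */
      else if Depth < 700 then Depth_Cat = 'Deep';
   run;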
With the earthquakes categorized by depth, back on the Code tab I'd like to add a PROC FREQ step to create one-way frequency reports. To do this, I'll take advantage of autocomplete. At the end of the program I'll begin typing PR, and notice that autocomplete appears with a list of suggested keywords, with PROC highlighted. The syntax help also appears for the highlighted keyword, with a description of the keyword as well as links to the product documentation, samples, and SAS Notes and papers. You can also hover over any blue keyword in a program to see the syntax help for that keyword, which is especially helpful when looking at someone else's code that may include procedures or options you aren't familiar with.

To add the PROC keyword to the program, I'll double-click it in the autocomplete window. Pressing the spacebar, the autocomplete window appears again, this time with a list of procedure names. I'll type FR, then press Enter to add the highlighted FREQ keyword to the program. Pressing the spacebar again, autocomplete provides a list of valid options for the PROC FREQ statement. I'll type a D, then press Enter to add the DATA= option and specify the input table for the analysis. Autocomplete displays the tables referenced in the program, so I'll double-click WORK.EARTHQUAKES, which is the output table from the previous DATA step. Pressing the spacebar again, the list of options for the PROC FREQ statement appears; I'll type an N, then press Enter to add the NLEVELS option, which includes a table with the number of distinct values for the columns being analyzed. That's it for the PROC FREQ statement, so I'll type a semicolon to end the statement and move to the next line in the program.

To specify the columns to analyze, I need to use a TABLES statement, so I'll type TA, and when the autocomplete window appears I'll use the down arrow key to highlight TABLES, press Enter to add it, and type a space. Instead of directly typing the column names, I can take advantage of the Libraries section in the navigation pane to drag and drop column names. In the Libraries section I can see the list of libraries I have access to. The Earthquakes table is in the Work library, so I'll expand Work, then Earthquakes, to see the list of columns in the table. I want to analyze Depth_Cat and Type, so I'll select Depth_Cat, hold down the Ctrl key, select Type, then drag and drop the columns into the program to add them to the TABLES statement. This feature is especially helpful when you need to include many column names in your program. You might notice that Type is enclosed in quotation marks and followed by the letter N. This is known as a SAS name literal: when you drag a column or table name that is also a reserved word in a database, like Type, or a name that doesn't conform to traditional SAS naming conventions, SAS Studio automatically writes the name as a SAS name literal for you to ensure that it's evaluated correctly by the program.

To add options to the TABLES statement, I'll type a space, a forward slash, then a space again, and autocomplete provides a list of options for the TABLES statement. I'll type an N to filter the list, then double-click the NOCUM option to suppress the display of cumulative frequencies and percentages in the generated report. I'll end the statement with a semicolon, and on the next line type a RUN statement to end the PROC FREQ step.
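The completed step built with autocomplete should look roughly like this (a sketch; 'Type'n is the name literal that SAS Studio writes when the column is dragged in):

   proc freq data=work.earthquakes nlevels;   /* NLEVELS adds a table of distinct-value counts              */
      tables Depth_Cat 'Type'n / nocum;       /* NOCUM suppresses cumulative frequencies and percentages    */
   run;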
Using autocomplete is a great way to quickly write code and to learn about options and keywords that you can use. To quickly improve the formatting of the code, on the program toolbar I'll click the Format code button. Looks great. Since I already ran the DATA step, I only need to run the PROC FREQ step this time. To run just a portion of a program, highlight the portion you want to run: I'll highlight the PROC FREQ step, then click Run or use the F3 shortcut. The Results tab displays the one-way frequency report. We can see that there are two unique values of Depth_Cat and six unique values of Type. It looks like most earthquakes in the data are shallow earthquakes, and while there are several different Type values, like explosion and landslide, most have a value of earthquake.

At this point it's important to mention that SAS Viya includes multiple servers to execute SAS code, the two primary ones being the SAS Compute Server and SAS Cloud Analytic Services, or CAS. The code that has been submitted in this tutorial is traditional SAS 9 code that you may be used to, and it was executed on the SAS Compute Server by default; there is no need to learn any new syntax to use the compute server. CAS, on the other hand, is the high-performance server that performs parallel processing on in-memory data, and you'll likely use it for big data and complex analytics. Usually only very minor code modifications are required to run programs in CAS. In this tutorial all code will be executed on the compute server, but keep in mind that CAS is available; you can check out the documentation for more information about executing programs in CAS.

You've seen several great programming features that you can take advantage of in SAS Studio, but you can modify or turn off some of these features if you'd like. To do this, on the main toolbar I'll go to Options, then Preferences. You can set many options to customize SAS Studio in the Preferences window, but let me point out a couple that relate to the features we've seen. Under SAS Programs, if you select Code and Log, towards the bottom you can use the Program tab layout option to change the default tab layout for all SAS programs, not just the active program. On the Editors page, if you select Editor options, you can change editor settings like turning off autocomplete and changing the font for SAS programs. I won't make any changes, so I'll select Cancel, then Cancel again, but know that you have these customization options available to you.

Finally, to save the program under a different name, on the program toolbar I'll select the Save As button. I'll navigate to and select the folder I'd like to save the program to, then change the program name to Earthquake Category Frequency. In the Type drop-down list, notice that you can choose to save the program, the log, or the results, or create a program summary page or a SAS program package file. A program summary page includes information about the program execution, the complete SAS source code, the complete SAS log, and the results. A SAS program package contains a snapshot of a SAS program along with its log and HTML results. I'll use the default Program type and save the program.
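As a quick illustration of the CAS point above, here is a minimal, hypothetical sketch of what running the same DATA step in CAS could look like; the session and libref names (mySession, mycas) are placeholders, not anything used in this tutorial.

   cas mySession;                        /* start a CAS session                         */
   libname mycas cas caslib=casuser;     /* a libref that points to an in-memory caslib */

   data mycas.quakes;                    /* load the input data into CAS                */
      set sashelp.quakes;
   run;

   data mycas.earthquakes;               /* input and output are both CAS tables,       */
      set mycas.quakes;                  /* so this DATA step runs in CAS               */
      /* ...same categorization logic as before... */
   run;

   cas mySession terminate;              /* end the CAS session                         */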
Now let's talk about flows in SAS Studio. As I mentioned before, a flow is a visual sequence of operations on data. Data and operations are represented by nodes, which can be connected to specify the order of execution, and there are different types of steps that you can add to a flow as nodes, like SAS programs and queries. Let's see how we can use a flow to prepare and analyze earthquake data.

I'd like to add the code from the previously saved Earthquake Category Frequency SAS program to a flow as a starting point. I could create a new flow and directly add the program as a SAS program node, but an alternative is to convert the program to a flow, in which the input tables, individual procedures, and output tables in the program are broken up to create individual nodes in the flow. Before I do that, though, I'll start a new SAS session to reset options and clear out temporary tables and files. A really easy way to do this is to select Options on the main toolbar, select Reset SAS session, then Reset.

Now let's convert the program to a flow. At the end of the program toolbar for the Earthquake Category Frequency program, I'll click the More options button (three vertical dots), then select Create flow from program. I want to change the name of the flow, so I'll click the open icon, save it in the same folder as the program, and name the flow Earthquake Analysis. I'll click OK, then OK again. Behind the scenes, SAS runs the program, and a new flow tab, Earthquake Analysis, opens in the work area.

Let's take a look at this new flow. The flow illustrates that the Quakes table is used as input to the DATA step to create the Earthquakes table, and then a PROC FREQ step is used to analyze the Earthquakes table. The DATA step and FREQ nodes are program nodes containing portions of the original Earthquake Category Frequency program, so if I select the FREQ node, for example, in the node details below the flow canvas you'll see that just the PROC FREQ portion of the code is stored in this node. Keep in mind that if an existing SAS program is added to a flow, a copy of the code is added to the flow, meaning that any changes made to the original SAS program won't affect the code in the flow, and vice versa. The Quakes and Earthquakes nodes are table nodes representing SAS tables; selecting the Earthquakes node, you can see some of the properties of the table and even preview the data.

Let's expand this flow. I want to further enhance the Earthquakes table by joining, or combining, it with a lookup table that maps the individual magnitude values to descriptive classes. This lookup table is stored in a CSV file on my local computer, so the first step is to upload it to the server before I can use it in the flow. In the Explore section, I'll navigate to and select the folder I want to upload the file to, then on the Explore toolbar click the Upload files button. I'll click the Add button, select the magnitude classes CSV file, then click Open, then Upload. With the file uploaded, I'll drag it from the Explore section onto the flow canvas to add it as a file node.

Now I need to import this file before joining it with the Earthquakes table. There are several ways to start an import. In the Steps section of the navigation pane, you can view the steps that can be added to a flow as nodes, including the Import step. If your site doesn't have a license for SAS Studio Analyst, you won't have access to all of these steps, but as I mentioned before, in this tutorial we'll only use steps that don't require that additional license. To import a CSV file, I can drag the Import step towards the right portion of the magnitude classes file node and drop it when Connect to output port appears. Alternatively, you can simply right-click the magnitude classes file node and select Add an import; either method connects the file to the input port of a new Import node. There's a red unfinished-state icon on the Import node, which means we'll need to set some options before the node can be run without errors.

With the Import node selected, let's specify these options in the node details. I'll click the Maximize preview button so that I have a little more room to work with. On the Options tab, I'll first click the View raw file button so that we can understand the structure of the file before specifying options. This is a comma-delimited file, and the first record contains what can be used as column names.
The first field contains the magnitude lower bound, followed by the magnitude upper bound, and the last field contains the corresponding class description for earthquakes in the specified magnitude range. So, for example, earthquakes with magnitude values between 5 and 5.9 are considered to be moderate earthquakes. I'll click Close, then select Options to take a look at the import options. The file was recognized as a CSV file, and the values in the first row are automatically used as column names, though we can change these column attributes before importing. The Rename column names to comply with SAS naming conventions option is selected, so the spaces in the column names get replaced with underscores. The first row containing data values is the second row, so I don't need to make any changes to the analysis options for this import. Taking a quick look at the update options, I also don't need to make any changes here, but know that you can customize the import. I'll click OK.

With the options good to go, I'll click Analyze to identify the structure of the file. Now the three columns are listed, and because of the import options, the spaces that were in the column names are replaced with underscores. There's also a preview of the output data. Before importing a file, you can make changes to the column attributes: I'll change the column names for the magnitude lower bound to Magnitude_Low, the magnitude upper bound to Magnitude_High, and the class to Magnitude_Class, and I'll leave all of the other attributes at their default values. I'll click Update to see an updated preview of the data. Looks good. Next, on the Node tab, I'll change the name of the node to Import Classes. That's it for the import, so I'll click the Minimize preview button to bring back the flow canvas, and then, to run the entire flow, click Run on the flow toolbar.

The green check marks indicate success. Notice on the Import Classes node that the output port is now filled in, meaning the node ran successfully and data is available from the output port. Looking at the generated code tab, I'll click Refresh to view the generated code for all nodes. The submission tab shows the submitted code and the resulting log, results, and output data. We can see the imported table on the Output Data tab (you may need to use the drop-down list to select the imported table to view it), but notice the name of the imported table: because a table node was not connected to the output port of the Import Classes node, a temporary table was created in the Work library with a default name. We'll see how to specify the name and location of the output table in an upcoming example.
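The Import step generates its own code behind the scenes; for comparison, a hand-written PROC IMPORT that produces a similar table might look roughly like the sketch below. The file path is a placeholder, and the pre-rename column names depend on your naming settings.

   proc import datafile='/path/to/magnitude classes.csv'   /* placeholder path to the uploaded CSV */
               out=work.magnitude_classes
               dbms=csv
               replace;
      getnames=yes;    /* first row supplies the column names */
      datarow=2;       /* data values begin on the second row */
   run;

   proc datasets library=work nolist;    /* rename the columns to match the names used later */
      modify magnitude_classes;
      rename Magnitude_Lower_Bound = Magnitude_Low
             Magnitude_Upper_Bound = Magnitude_High
             Class                 = Magnitude_Class;
   quit;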
Back on the Flow tab, with the lookup table imported, we're now ready to join the Earthquakes table with the lookup table, and we'll do this with a query. You can add a Query step from the Steps section in the navigation pane, but an alternative, which is what I'll do, is to right-click the Earthquakes table node and select Add a query. This connects the table to an input port of a new Query node. To connect the imported lookup table to the new Query node, you can select and hold the output port of the Import Classes node, then drag it towards the Query node; when a second input port appears on the Query node, place your cursor on the second input port and release to connect them. You can also right-click the Query node and select Add input port before connecting them. This time I'd like to specify a name and location for the resulting output table, so I'll right-click the output port of the Query node and select Add a table to add a table node. With the table node selected, I'll restore the preview of the node details, and on the table properties tab I'll click the library icon, select the Work library as the location, and name the table EARTHQUAKES_CLASS. I'll click OK.

Now let's go back to the Query node. I'll maximize the preview, and you'll notice that both tables are listed in the Columns area; since both tables are input to the query, all columns from both tables can be used in the query. The table alias names, or nicknames, t1 and t2 might be different depending on the order in which the tables were added to the query, and that's okay.

Let's start by taking a look at the Join tab. The default join type is an inner join, which returns only the subset of rows from the first table, Earthquakes, that matches rows from the second table, the imported lookup table. I'll stick with the default join type. SAS automatically attempts to join the tables by matching columns with the same name and type; if there are no matching columns, the tables are joined using the first column from each table. So you'll notice that the default join condition is Latitude equal to Magnitude_Low. Instead, rows from the two tables should be matched based on the magnitude range defined by Magnitude_Low and Magnitude_High: if the magnitude value of the earthquake falls within the range, it's considered a match, and the corresponding rows from each table are joined into a single row. This requires two join conditions. On the first join condition, I'll click the column button to the left of the operator, select Magnitude, and click OK. I want the magnitude value to be greater than or equal to Magnitude_Low, so I'll leave that column selected. To add a second join condition, I'll click the Add a condition button next to the first join condition (if you don't see it, you may need to hover over the first join condition for it to appear). The second join condition will be Magnitude is less than or equal to Magnitude_High.

Now that the join conditions are set, let's go back to the Select tab. Columns need to be added to the Select tab in order for them to be included in the output table. I want to include all columns from the Earthquakes table, so a quick trick is to drag t1, Earthquakes, onto the Select tab; you can also double-click the table name instead. Rather than adding the Magnitude_Class column from the imported lookup table as is, I want to create an enhanced version that lists the magnitude class description, the magnitude lower bound, and the magnitude upper bound all together. To create this new column, in the Columns area I'll click Calculated column. The expression for this new column can be typed directly into the expression area, or it can be built by selecting functions, columns, and operators. The new column will concatenate different column values into one string, so on the Functions tab I'll expand the Character folder and double-click the CATX function to add it to the expression. The CATX function concatenates strings and inserts delimiters between each string; the delimiter is specified in the first argument, and the strings to concatenate follow. With string1 highlighted, I'll type a blank enclosed in quotation marks to replace the first argument and specify a blank as the delimiter. Next, I'll highlight string2, then on the Data tab double-click Magnitude_Class under the imported lookup table to replace the second argument. I'll type a comma (but you can also select it), then type a colon in quotation marks.
After another comma, I'll double-click Magnitude_Low from the lookup table, type another comma, then 'to' in quotation marks, another comma, and then double-click Magnitude_High. That's it for the expression, so moving on to the Properties tab down at the bottom, I'll name this new column Magnitude_Description. I won't specify any of the other column attributes, but notice the Add new calculated column to the Select tab check box. With this selected, I'll click Save, and notice that the new column is added to the end of the Select tab. The order of the columns on the Select tab determines the order of the columns in the output table, so I'll drag the new column to directly after Magnitude. You can also right-click a column to move it up or down, or to the top or bottom of the list.

With the columns we want added, let's move on to the Filter tab. I only want to include rows with a Type value of earthquake, so in the Columns area I'll expand the Earthquakes table, then double-click Type. Next to Type on the Filter tab, I'll click the Set a filter on a column button. I'll use the default condition of equal to, and I could directly type a value in the Value box, but instead I can retrieve a value from the column: next to the Value box, I'll click the lookup value button, then click Get values to see a list of Type values in the Earthquakes table. From the list I'll select earthquake and click OK. With the Match case check box cleared, the filter will be case insensitive, so I'll leave that cleared. I'll leave the Quote strings check box selected and click Filter. Finally, moving on to the Node tab, I'll name the node Join Classes.

That's all I need to specify for the query, so I'll restore the preview. This time, instead of running the entire flow, I'll run just the query. To do that, I'll select the Join Classes node, then click the Run a single selected node button on the flow toolbar. You can also right-click a node and select Run node, too.
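The Query node also generates code behind the scenes; a roughly equivalent hand-written PROC SQL step might look like the sketch below. Table names are illustrative, and the calculated column lands at the end of the column list rather than directly after Magnitude.

   proc sql;
      create table work.earthquakes_class as
      select t1.*,
             catx(' ', t2.Magnitude_Class, ':',
                       t2.Magnitude_Low, 'to', t2.Magnitude_High)
                as Magnitude_Description          /* e.g. "Moderate : 5 to 5.9" */
      from work.earthquakes as t1
           inner join
           work.magnitude_classes as t2           /* the imported lookup table  */
           on  t1.Magnitude >= t2.Magnitude_Low
           and t1.Magnitude <= t2.Magnitude_High
      where upcase('Type'n) = 'EARTHQUAKE';       /* case-insensitive filter    */
   quit;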
The node ran successfully, so let's take a look at the output table, EARTHQUAKES_CLASS. I'll select it, then in the node details select the Preview data tab. To more easily view additional rows and columns, I'll maximize the preview and also, in the upper right corner of the navigation pane, click the Hide pane button to hide the navigation pane. Looking at the new Magnitude_Description column, we can see that it lists the magnitude class obtained from the imported lookup table as well as the corresponding magnitude range. Looks great. On the flow toolbar, I'll click the Save icon to save the flow.

Finally, let's take a look at using tasks in SAS Studio. SAS Studio has several features to help generate SAS code: snippets and tasks. Snippets are lines of commonly used code or text that you can save and reuse, and SAS Studio ships with several code snippets that you can find under the Snippets section in the navigation pane. Tasks, on the other hand, are point-and-click interfaces to SAS procedures. Unlike steps, tasks can't be added directly to a flow, but the code generated by a task can be copied into a flow. So let's use a task to create a geographical map of earthquake locations, then add the code to our flow.

In the Tasks section of the navigation pane, I'll expand Visualize Data, then Map. There are several mapping tasks to choose from, but I'll double-click the Bubble Map task. As I did before, in the upper right corner of the navigation pane I'll click the Hide pane button to hide the navigation pane. The first step is to define the input data for the task. I'd like to use the EARTHQUAKES_CLASS table that was generated from the query in the flow earlier, so next to the Data box I'll click the Select a table folder icon. The table is stored in the Work library, so I'll select that library, select the EARTHQUAKES_CLASS table, and click OK. I'd also like to apply a filter to the table to include only earthquakes with a magnitude of 5 or higher in the map, so under the table I'll click Filter to build a filter expression. On the Data tab, I'll double-click Magnitude to include it in the expression, then I'll select the is greater than or equal to button and type a 5.
That's it for the filter expression, so I'll click Save. Now, in the task window, you'll notice several messages. These messages indicate that some of the roles require a column to be assigned to them; these roles determine how columns are used in the task, so let's go ahead and do that. To assign a column from the input table to a role, simply click the Select columns (plus sign) button next to the role. To the Latitude role I'll assign the Latitude column containing the latitude values, and similarly, to the Longitude role, the Longitude column. I want the bubble size to represent the magnitude value, so I'll make that assignment, and finally, I'd like each magnitude class, or description, to be represented by a different color, so to the Group role I'll assign Magnitude_Description. Scrolling down a little, I'll use the default base map layer of OpenStreetMap; if you'd like, you can select Esri map to specify a URL to a specific Esri map you'd like to use. As a side note, you might have noticed that as you make changes to the task options, the code on the Code tab is automatically updated.

That's it for the Data tab, so let's move on to the Appearance tab. Under the Legends section, I'll leave the Generate plot legend check box selected and label it Magnitude Class. Under the Plot section, I'll increase the transparency to about 0.63. Next, I'll expand the Title and Footnote section and specify a title of Magnitude 5 or Greater Earthquakes; I'll also increase the font size of the title to 15. Finally, I'll expand the Graph Size section and increase the width of the graph to eight inches and the height to six inches.

Let's run the task and take a look at the results. And here's our map: again, the bubble size represents the magnitude, and the colors indicate the class, which is explained in the legend. We can see that many magnitude 5 or greater earthquakes have occurred along the Gulf of California. To save the options and settings specified in the task, I'll click the Save As button, navigate to a folder of your choice, name the file Magnitude 5 Plus Map, and then click Save.

Now let's copy the task-generated code to the Earthquake Analysis flow. All I need to do is select the Code to flow button on the task toolbar, then select the flow I want to add the code to. Going back to the Earthquake Analysis flow tab, you can see that the task code was added as a program node to the flow. Since the EARTHQUAKES_CLASS table was used as input to create the map, I'd like to connect the table to the Magnitude 5 Plus Map node; this will also control the order of execution when running the flow. To make this a little easier, I'll first drag the Magnitude 5 Plus Map node next to the EARTHQUAKES_CLASS node, then I'll hold the right edge of the EARTHQUAKES_CLASS table node and drag it to connect the arrow to the input port of the Magnitude 5 Plus Map program node. A quick tip: if you want to quickly optimize the layout of the flow, you can click the Arrange nodes button on the flow toolbar. Perfect.
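For reference, the code produced by the Bubble Map task with these settings should look roughly like the sketch below; exact option names and values in the generated code will differ.

   ods graphics / width=8in height=6in;           /* graph size from the Appearance tab        */

   title height=15pt 'Magnitude 5 or Greater Earthquakes';
   proc sgmap plotdata=work.earthquakes_class;    /* later replaced with &_input1 in the flow  */
      where Magnitude >= 5;                       /* the filter applied in the task            */
      openstreetmap;                              /* default base map layer                    */
      bubble x=Longitude y=Latitude size=Magnitude /
             group=Magnitude_Description          /* one bubble color per magnitude class      */
             transparency=0.63;
      /* the task also adds a legend titled 'Magnitude Class' */
   run;
   title;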
Looking at the Magnitude 5 Plus Map node details, specifically the Node tab, notice the Input ports and macro variables and Output ports and macro variables sections. A macro variable stores text that is substituted into your code when it is run. You can specify names for the macro variables storing the table names connected to the input and output ports of a program node, then use those macro variables in the code. This makes it easy to reuse your flow with other tables: if other tables have the same structure as the current table, using these macro variables would enable you to simply swap out the table in the flow without making any changes or adjustments to the code. Let's try using them, specifically the input port macro variable; I'll keep the default name of _input1. Back on the Code tab, on the PROC SGMAP statement, the PLOTDATA= option references the WORK.EARTHQUAKES_CLASS table, so I'll replace the table name with the macro variable. Macro variables are referenced by preceding the macro variable name with an ampersand, so I'll type &_input1. When the code is run, SAS will replace &_input1 with the name of the table connected at the input port, WORK.EARTHQUAKES_CLASS. I'll go ahead and minimize the preview.

As we saw earlier, we can run the entire flow simply by clicking the Run button, but I'd like to show you another feature that you can take advantage of: the background submit feature. Background submit enables you to run SAS programs, queries, tasks, or flows in the background while you continue to use SAS Studio, which is especially great for long-running jobs. To run this flow in the background, I'll click the Submit the flow in the background using another session button on the flow toolbar. You can also right-click an item in the Explore section of the navigation pane and select Background submit to submit it without opening it. Now let's look at the Submission Status window; I'll select it from the View menu to open it. At the bottom of the list is the most recent submission of the flow, and the green check indicates that it ran successfully. If you'd like, you can select an entry in this window to return to an earlier version of a program, task, query, or flow and view the associated log. I'll go ahead and close the Submission Status window, then save the flow one last time.

To wrap up, I'd like to close all of the tabs except for the Start Page tab; a quick trick is to right-click the tab you want to keep open, then select Close others. Whether you're a SAS programmer or prefer to use point-and-click methods to analyze your data, I hope this gave you a glimpse of some of the features that are available in SAS Studio on SAS Viya. And as I mentioned before, SAS Viya is updated continuously, so be on the lookout for even more features and enhancements in future releases. Thanks for watching!
Info
Channel: SAS Users
Views: 4,384
Rating: 5 out of 5
Keywords: getting started with sas studio, sas studio, sas viya, sas studio on sas viya, sas how to tutorial, working with flows, sas, sas cloud, working with flows in sas studio, working with flows in sas viya
Id: 44xg6-sUyvU
Length: 44min 29sec (2669 seconds)
Published: Mon May 17 2021