Streamline your sales forecasting process with airSlate SignNow's automation for engineering

Effortlessly optimize sales forecasts for Engineering with airSlate SignNow's user-friendly solution. Experience great ROI, easy scalability, and superior support.

airSlate SignNow regularly wins awards for ease of use and setup

See airSlate SignNow eSignatures in action

Create secure and intuitive e-signature workflows on any device, track the status of documents right in your account, build online fillable forms – all within a single solution.

Collect signatures
  • Collect signatures 24x faster
  • Reduce costs by $30 per document
  • Save up to 40h per employee / month

Our user reviews speak for themselves

Kodi-Marie Evans
Director of NetSuite Operations at Xerox
airSlate SignNow provides us with the flexibility needed to get the right signatures on the right documents, in the right formats, based on our integration with NetSuite.
Samantha Jo
Enterprise Client Partner at Yelp
airSlate SignNow has made life easier for me. It has been huge to have the ability to sign contracts on-the-go! It is now less stressful to get things done efficiently and promptly.
Megan Bond
Digital marketing management at Electrolux
This software has added to our business value. I have got rid of the repetitive tasks. I am capable of creating the mobile native web forms. Now I can easily make payment contracts through a fair channel and their management is very easy.
Walmart · ExxonMobil · Apple · Comcast · Facebook · FedEx

Why choose airSlate SignNow

  • Free 7-day trial. Choose the plan you need and try it risk-free.
  • Honest pricing for full-featured plans. airSlate SignNow offers subscription plans with no overages or hidden fees at renewal.
  • Enterprise-grade security. airSlate SignNow helps you comply with global security standards.

Sales forecast automation for engineering

Are you looking to streamline your sales forecasting process in the engineering industry? airSlate SignNow offers a solution that allows you to automate your sales forecast with ease. By utilizing airSlate SignNow, you can save time and improve efficiency in your business operations.


Experience the benefits of airSlate SignNow for your engineering sales forecasting needs. With its user-friendly interface and powerful features, airSlate SignNow can help you streamline your processes and improve productivity. Try airSlate SignNow today and see how it can transform your sales forecast automation.

Get started with airSlate SignNow and revolutionize your sales forecast automation for engineering today.

airSlate SignNow features that users love

Speed up your paper-based processes with an easy-to-use eSignature solution.

Edit PDFs online
Generate templates of your most used documents for signing and completion.
Create a signing link
Share a document via a link without the need to add recipient emails.
Assign roles to signers
Organize complex signing workflows by adding multiple signers and assigning roles.
Create a document template
Create teams to collaborate on documents and templates in real time.
Add Signature fields
Get accurate signatures exactly where you need them using signature fields.
Archive documents in bulk
Save time by archiving multiple documents at once.

Get legally-binding signatures now!

FAQs online signature

Here is a list of the most common customer questions. If you can’t find an answer to your question, please don’t hesitate to reach out to us.

Need help? Contact support

Trusted e-signature solution — what our customers are saying

Explore how the airSlate SignNow e-signature platform helps businesses succeed. Hear from real users and what they like most about electronic signing.

This service is really great! It has helped...
5
anonymous

This service is really great! It has helped us enormously by ensuring we are fully covered in our agreements. We are on a 100% for collecting on our jobs, from a previous 60-70%. I recommend this to everyone.

Read full review
I've been using airSlate SignNow for years (since it...
5
Susan S

I've been using airSlate SignNow for years (since it was CudaSign). I started using airSlate SignNow for real estate as it was easier for my clients to use. I now use it in my business for employment and onboarding docs.

Read full review
Everything has been great, really easy to incorporate...
5
Liam R

Everything has been great, really easy to incorporate into my business. And the clients who have used your software so far have said it is very easy to complete the necessary signatures.

Read full review

Video transcript: mastering sales forecasting with predictive AI

Welcome, everyone, to Mastering Sales Forecasting, where we're going to learn how to take the guesswork out of forecasting with predictive AI. I'm Natasha from the marketing team at Akkio, and we're joined by Jonathan Reilly and Elvis Saravia, who will lead the discussion and show two forecasting examples, one for sales and one for website traffic. I'll hand it off to them to get started.

Nice to have everyone here. I'm excited to have this discussion with Elvis today about time series forecasting. Working with time matters for pretty much any modeling you might do, because time is a constant across all businesses and data streams. As a system evolves it changes, so one of the most important things when forecasting sales, support staffing, or inventory is to take time into consideration as a variable. Traditionally that's been hard to do. In my experience working in businesses, it's primarily been done in Excel sheets: you build really complicated revenue forecasts, models with assumptions about the relationships between inputs and outputs, and it usually takes weeks to put together a competent forecast. Then you have to redo that forecast every so often as your environment changes and you get more information. So historically it's been really hard to work with time as an input variable, and today we're going to show how a new generation of tooling, in this case the Akkio platform, makes it pretty easy to work with time as a variable using machine learning and time series modeling, so
that you can start building your own forecasts without going through all the pain of the Excel approach. I'm excited to have Elvis tell us a little about how ML works with forecasting. He'll walk us through an example in Akkio, then I'll cover a couple of examples, and we'll have a discussion and take questions.

Thank you, John. I'll do a little introduction here. I'm quite excited for today. Forecasting is one of those things that's very common in industry; it can be used for pretty much any application you can imagine, and there's a lot of time series data out there. My background is more around language models, and I've also worked a lot with deep learning models. One of the use cases I work with a lot is managing logs and traffic, data that obviously has a time series component. You can create a lot of value from it, but some of the really important questions are: how do you provision, how do you manage your traffic, how do you predict or forecast what the traffic is going to look like? That's a very important use case I'm seeing in MLOps and that kind of domain. From my experience with tools in the industry that attempt to provide forecasting features, it's also very tedious to do forecasting today; with Akkio we'll see how simple it is, which is why I'm excited about the tool. But with other tools, there's always this process you need to think about:
training the models, organizing your data, doing all the data engineering to get your data into a format that yields really good forecasting results. Some of that is really challenging, and I think Akkio is doing a wonderful job creating tooling that makes all of this easier so you can just get value from the data and make good decisions.

OK, I think we can jump into our first demo. Historically I've used a lot of notebooks to do forecasting, something like scikit-learn combined with data I found out there. I had to do a lot of data engineering to prepare my data, look at the different features to use as factors for forecasting, train the model, and so on. The nice thing is that Akkio makes this whole process very efficient: you can iterate on your models and try different things out. So I'll share my screen and show a demo of how I've been using it. You should be able to see my screen, right? (Yes, we can see everything.) John, as I go through, feel free to jump in with anything you want to share; I know you have a demo after this one. I'll focus on web traffic, which I think is an interesting use case.

So here's where we start. You need to sign up for an account if you don't have one; it's very straightforward. On your home page there's a "create new flow" button. There are many things you can do with Akkio: forecasting, but also traditional
predictive models, plus features more related to data engineering. The starting point is getting a dataset. For this case I've chosen a dataset in tabular format that I grabbed from Kaggle, a public Wikipedia traffic dataset covering the different projects under Wikipedia: Wikimedia, Commons, the different language editions. I grabbed a few of those and converted them into a nice table format.

I've created this flow and given it a name, "Wikipedia web traffic"; you can call yours whatever you want. There are other features here, like Chat Explore, which I love and have been experimenting with a lot, so check that out if you have a chance, and there's also a report feature, but here I'm focusing on forecasting. You can upload a dataset or work with datasets you already have. I'll upload one to show how I go about this. Several formats are accepted: Excel, CSV, JSON. One thing that's important is to have column names. I'm using a CSV that's already in the right format: I have the different columns, and I need a time sequence, which for me is the date. I'll upload it and show you how it looks.

And if your data lives somewhere like Snowflake or BigQuery, you can connect to the live dataset as well and work right out of your data warehouse instead of uploading a CSV; it's intended to be fully flexible. A little
bit later on, when we get into the retraining discussion, we'll talk about why live datasets matter for time series, and I've got some examples of what happens if you don't retrain your time series models on a regular basis. But go ahead, take it away, Elvis.

Thank you. I'm looking forward to that live-data example; it's something you'll obviously need if you work with logs. This one is just a static dataset, web traffic from roughly 2015 to 2017, and you can see a preview of it here. It's also integrated with Chat Data Prep, which is pretty useful for folks doing data engineering; you can transform data here as well.

Notice that when I uploaded my dataset, I have my date, the sequence I want to apply the forecasting model to, but I also have these useful features. This one is views: aggregated views. Commons, Wikipedia, and Wikimedia each have many pages, and this is the aggregated view count, the sum across all the pages of that particular project. Then there are other projects, like the English version of Wikipedia and MediaWiki. I want to convert this column to a number, because that's what this dataset is about. Types are auto-detected using an ML engine, but numbers that don't repeat can look like IDs, so sometimes you have to check the detected type. And it's nice that you can do
it here: you don't need to go back into your notebooks or anything, you just convert it on the fly as you prepare your data, which is very useful.

Once that's ready, the next step in the workflow, on the left-hand side, is the actual forecasting. There's also a neat anomaly-detection feature, which is related to forecasting and helpful for understanding whether your dataset has anomalies, but I'll focus on forecasting. In this view you configure your forecasting job on the left-hand side, and there's a button for creating the predictive model, which runs the training process. I'm using the Wikipedia traffic CSV I just uploaded. The important configuration is time: I need to pick a time component, and it already detects "date", so that's what I'm using to denote the sequence, with the different days in that format. There's also an ID field; John will talk more about this, but it's useful if you have more than one sequence to work with. I think it's a very neat feature with a lot of potential, but my dataset is pretty basic and I don't have any other sequence information to include, so I'll leave it empty.

Also important: the predicted field. I have three fields I could predict. In this example I'm going to use "en", the English version of Wikipedia; that's what I'm looking at, and I'll ignore the others. I was actually forecasting the other Wikipedia projects as well, but
here I'm only interested in the one. If you don't ignore the others, they'll be taken as inputs to the forecast, and in this case, because traffic patterns are presumably fairly similar across the Wikipedia properties, it might be cheating to include them: the model would look at their traffic levels and extrapolate English Wikipedia from them. So if you have other factors or features you want to use, this is where you pick and choose.

Once that's set, I can choose how far I want to predict, which is very important. The recommendation is not to go above a one-to-one ratio; you want something like 30 percent (the default here) or maybe 40 percent of the number of data points you have. So with a one-year span of data, you might forecast around four months into the future. But you can experiment, and the cool thing with Akkio is that you can quickly iterate over different versions: choose different models and configurations and just try them out. I'll leave the default of 30 percent.

Then there's aggregation. My dataset is per day, so I have views per day, but I could easily convert this into weeks or months, and if I had hourly or minute-level data I could aggregate it to daily. Because this is already daily, I'll use no aggregation here.
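The aggregation setting described here is conceptually just a resample over the time column. A minimal pandas sketch, using made-up column names ("date", "en") rather than the actual dataset's schema:

```python
import pandas as pd

# Hypothetical daily traffic series standing in for the Wikipedia dataset.
df = pd.DataFrame({
    "date": pd.date_range("2015-07-01", periods=14, freq="D"),
    "en": [100, 120, 90, 95, 110, 140, 130,
           105, 115, 92, 98, 112, 145, 128],
})

# Daily -> weekly: sum the views inside each weekly bucket, which is what
# a "weekly" aggregation setting does conceptually.
weekly = df.set_index("date")["en"].resample("W").sum()
print(weekly)
```

For intermittent measurements you would swap `.sum()` for `.mean()`, which is the mean-aggregation case discussed later for temperature-style data.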
You can also choose a training mode. There are two options, Fastest and High Quality. Because I want to experiment quickly and just get a feel for how the models perform on this dataset, I'll choose Fastest. There's an advanced section too; maybe John can share more about this later, but it has options like ensemble modes, which are really popular in forecasting, baselines such as linear models and exponential smoothing, which is also really popular, and neural networks.

By default we compare all of those model types against each other and pick the one that does best on a back test. Often a different model type suits a different data structure depending on your forecast, and traditionally you'd have to try a bunch of models and compare them yourself, a long, painful process. The nice thing about an AutoML engine is that it can automatically try them all and compare which one does best on history, so I usually recommend leaving that setting on best back test. The only time I play with it is because of how the back test works: it uses the most recent 20 percent of history as the back-test window, meaning it takes the first 80 percent of history, trains on it, projects the most recent 20 percent, and compares which model did best. If your dataset has some large dislocation due to environmental effects right around that 20 percent boundary, the model it picks may not be the most appropriate, and that's where you might adjust either the model type or the duration of data you're feeding it. Then you can go ahead and hit train to start training the model.
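The back-test procedure described here can be sketched in a few lines. This is a toy illustration of the idea, not Akkio's actual engine: hold out the most recent 20 percent of the series, fit each candidate on the first 80 percent, and keep whichever has the lowest error on the held-out window. The two candidate "models" (naive last value and a fitted linear trend) are stand-ins for the real candidate set.

```python
import numpy as np

y = np.array([10.0, 12, 13, 15, 16, 18, 19, 21, 22, 24])
split = int(len(y) * 0.8)            # 80/20 split
train, test = y[:split], y[split:]

def naive_forecast(train, horizon):
    # Flat baseline: repeat the last observed value.
    return np.repeat(train[-1], horizon)

def trend_forecast(train, horizon):
    # Fit a straight line to history and extend it forward.
    t = np.arange(len(train))
    slope, intercept = np.polyfit(t, train, 1)
    future = np.arange(len(train), len(train) + horizon)
    return slope * future + intercept

candidates = {"naive": naive_forecast, "linear_trend": trend_forecast}
# Mean absolute error of each candidate on the held-out 20%.
errors = {name: np.mean(np.abs(f(train, len(test)) - test))
          for name, f in candidates.items()}
best = min(errors, key=errors.get)
print(best, errors)
```

On this steadily rising toy series the linear trend wins the back test, which is exactly the kind of comparison the engine automates across many model families.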
Typically, though, you do need to consider how your data environment is changing over time; that helps inform how much history to feed the model and how far forward to predict.

OK, thanks for that. I'll click "create predictive model", and I have a question for you, John, while this trains (it's going through the process really quickly here). I want to understand your thoughts on training mode and model type, since they're connected: here I chose Fastest, but I could just as easily have chosen High Quality.

Sure. The big difference is that in High Quality mode we look at more model types, more times, then ensemble some of them, and we give them more compute and more total processing time to learn the patterns in the historical data. The intended workflow is to iterate somewhat quickly in the fast mode, because sometimes you include a variable you didn't mean to; once you feel you've got the right variables, I would always run High Quality mode after that. So iterate fast with the fast mode until you're sure that's how you want to build the model, then run the longer mode. And as I said about the back test, I would always choose best back test unless you have a data structure with a significantly changing trajectory almost exactly twenty percent ago in history.

Yeah, that makes sense from a design perspective. As I go through the configuration there's a lot I'm already setting, and the idea of
the ML training happening automatically is really useful: I don't need to worry about choosing the model, just whether I want Fastest or High Quality. That's a really nice feature.

All right, I have some results based on this configuration. There are different sections to look at. There's a summary of the forecast, and there's accuracy, which is really important; you can open the details under accuracy and get a feel for it with this preview. You have the different metrics that matter in forecasting. You can see large absolute errors, but I'm working with really high numbers here since it's traffic data, and you can see more or less what the pattern looks like and how well the model is predicting this particular trend in this time series. Usually you want this error to be rather low, and I think the model did particularly well here at modeling the behavior.

I was going to say let's look at the forecast, but you can look at the confidence bounds too. Yes, the confidence bounds are quite interesting for forecasting. What you want to know is how confident the model is and how well it's modeling the behavior of your data. Usually with forecasting, accuracy will decay over time; that's expected. Whether you're extrapolating one or three months ahead, you want to see how well it's doing and how the performance of the model
degrades. For this model you can see it's remaining pretty consistent. John will cover a bit more of that with his example, but I just wanted to give you a preview.

So here's the actual output: you can see the history, see how the extrapolation looks, zoom in, and view the confidence interval. What we're seeing is the underlying pattern being recognized, but this is a tough forecasting problem, because there are spikes in Wikipedia traffic that correspond to events the model simply can't anticipate. It's learning the weekly patterns, and that's what you see in the bounds of its forecasts, but it has a fairly large uncertainty bound, because back in history there are occasional spikes where something happens that drives a bunch of traffic to Wikipedia. That's one of the realities of forecasting: some things are hard to predict, and some things have really clear patterns. The way this works is that it always tries to find the clearest pattern it can extract from the history.

Yeah, one of the challenges with forecasting, as you mentioned, is how to deal with outliers. There are definitely a lot of different events here; they might be event-related or something else, I don't know. How is that being handled?

There are a couple of approaches. You can handle them in the Chat Data Prep step if you want, removing significant outliers before building your model.
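The data-prep route of removing significant outliers before training can be sketched with a robust z-score. This is an illustrative recipe, not Akkio's built-in handling: a median/MAD score is used because a plain z-score gets inflated by the very spike it's meant to catch, and the 3.5 cutoff is just a common rule of thumb.

```python
import pandas as pd

# Toy series with one event-driven traffic spike.
views = pd.Series([100, 105, 98, 102, 2000, 99, 103, 101, 97, 104])

# Robust z-score: distance from the median, scaled by the median
# absolute deviation (MAD). 0.6745 makes it comparable to a normal z.
med = views.median()
mad = (views - med).abs().median()
robust_z = 0.6745 * (views - med) / mad

# Keep only points within the rule-of-thumb threshold.
cleaned = views[robust_z.abs() < 3.5]
print(len(views), "->", len(cleaned))
```

With an ordinary z-score the 2000-view spike on this small sample would not cross a threshold of 3, because it inflates the standard deviation itself; the robust version flags it cleanly.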
But actually, some of the models we run through the AutoML engine have outlier handling built in, so they're designed for it; typically that kind of model will handle the outlier exceptions while learning the pattern. With something like this you might get a slightly better result on the longer training mode, but we don't want to make people sit around and watch that. So that's forecasting Wikipedia traffic. It also extracts seasonality; we'll look at that, and then I'll show the next demo.

Great, I'll take over and show another couple of forecast examples that I think are pretty interesting. That Wikipedia example is pretty straightforward, but let's look at a sales forecast, because most businesses are interested in forecasting their sales, and let's take a really difficult problem instead of something simple like web traffic. This dataset is downloaded from Iowa's state website: liquor sales in the state of Iowa. Every store, each with a unique store number, reports at various intervals the amount of liquor it has sold, along with the category, the name (tequila, scotch, whatever), the vendor name, and the item description. You could think of this as a raw download from your financial system of every transaction you've ever had. And it's complicated: say you wanted to build a forecast of the overall sales in the state of Iowa,
or of an individual store, say store 2636. That would normally be a pretty hard task, because you'd have to aggregate everything together by day to build the aggregate forecast, or aggregate everything on a store-by-store basis and then forecast each store individually. But the controls we talked about earlier actually make that pretty easy. You just pick the date field again, and in this case I picked store number, so if I want a store-by-store forecast I simply pick store number, then sales in dollars, and we'll generate an aggregate forecast showing the weekly sales across every store in Iowa. I think this dataset only runs back to January of this year, but it's almost a million rows already; I guess they sell a lot of alcohol. And it's usually within 20 percent.

What's interesting is that instead of looking at the aggregate, we can pick any given store, look at its reporting, and see where we think its individual liquor sales forecast is going to go. This store, whichever one I picked, averages maybe around twenty-four hundred, and we forward-forecast that as well, and you can turn on the confidence interval and see where that sits too. So it's actually pretty easy to take a really complicated data structure, throw it right into the platform, and start generating store-level and aggregate-level forecasts, because we'll split the series out for you. And of course you can change your time windows too.

One of the interesting things about forecasting is that data comes with different sampling patterns. You might have a continuous variable, say temperature.
That one's pretty easy, because if you sample it every day you've got daily temperature. It's a little more complicated if you only measure the temperature intermittently, say every other day, or you sometimes miss a day; then you might want to take a mean aggregation, come down here, aggregate by week, and look at the mean temperature on a weekly basis, which we make really easy to do. And sometimes you have intermittent variables, like ordering widgets for a warehouse: you don't place an order most of the time, and occasionally you run out and place another one. If that's how your customers behave, if you're not a SaaS business but are ordering for fulfillment based on use, that's where aggregation, putting things into time blocks so there's a value in every block, becomes really helpful, because then you can start to generate really clean forecasts.

Just quickly, we also extract seasonality: it looks like Thursday is a big day for sales and Sunday not so much, and you can look at that on a monthly basis too. It looks like maybe there's a dry January going on, and as soon as dry January ends people get back to drinking in February, I suspect. We also do some auto-segmentation: here are the stores we think are growing the most, and here are the ones we think are declining the most, with average growth and average decline, so you can dig a little further into the data. So that's how you'd use this for a much more complicated look at things. And you can see the confidence bounds increasing: the further out you go, just like predicting the weather, the more uncertain you get.
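The seasonality readout described here (Thursdays run hot, Sundays don't) boils down to averaging the target by weekday. A small sketch on synthetic data:

```python
import pandas as pd

# Four weeks of synthetic daily sales where Thursdays spike.
idx = pd.date_range("2024-01-01", periods=28, freq="D")
sales = pd.Series([300 if d.day_name() == "Thursday" else 100 for d in idx],
                  index=idx)

# Average sales per weekday: the day-of-week seasonality profile.
by_weekday = sales.groupby(sales.index.day_name()).mean()
print(by_weekday.idxmax())   # weekday with the highest average
```

The same groupby over `sales.index.month_name()` gives the monthly profile (the "dry January" view).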
The next important thing I want to quickly cover is when data environments change, and I think COVID is a really good example: if you just looked at history, all of a sudden your models break. So it's important that if these models are connected to live data sources, they can automatically retrain as your data environment changes and as more points get added. That's a cool thing we enable too: if you hook us up to your Snowflake and you're building a forecast out of it, you can have the model automatically retrain.

So actually, over here, this is a data set I like: the United Airlines passenger count, I think at San Francisco's airport or something. It's plugging along, and if we got up here to the end of 2019 and you were building this forecast, you would think the traffic was going to keep going up, right? This one intentionally cuts off right before COVID happens, and it's got a pretty clear view of what's happening with traffic. But this other model actually contains the further data: what happens when all travel basically halted due to COVID and then slowly starts to pick back up. If you train on that, the model is really not sure what's going on. It's seen this high traffic, but it's also seen this section of almost no traffic. So it's going to try to figure out what's happening and build the best forecast it can, but this is a period where you're going to want to watch, make sure your models are retraining, and maybe really think about whether it's appropriate to only show it a shorter-term history as opposed to the longer-term history. There is still a lot of business analysis that goes into making any sort of model.
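The retrain-as-data-arrives behavior described above can be sketched as follows. This is a toy illustration only: the class name is invented, and a recent-window mean stands in for the real forecasting engine:

```python
import pandas as pd

def train(history: pd.DataFrame) -> float:
    # toy stand-in "model": forecast = mean of the most recent 90 observations
    return float(history["y"].tail(90).mean())

class AutoRetrainForecaster:
    """Automatically retrains once enough new rows have arrived from a live source."""

    def __init__(self, history: pd.DataFrame, retrain_every: int = 30):
        self.history = history
        self.retrain_every = retrain_every
        self.pending = 0  # new rows seen since the last retrain
        self.forecast = train(history)

    def add_rows(self, new_rows: pd.DataFrame) -> None:
        self.history = pd.concat([self.history, new_rows], ignore_index=True)
        self.pending += len(new_rows)
        if self.pending >= self.retrain_every:
            # the data environment has changed enough: retrain on updated history
            self.forecast = train(self.history)
            self.pending = 0
```

Hooked to a live data source, `add_rows` would be called on each sync, so a regime change like COVID gradually pulls the forecast toward the new reality.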
Just the same way businesses were impacted by COVID, the models are only as good as the history is at predicting forward, so when things are changing quite a bit, pay attention and retrain them more often.

One more feature I wanted to show you, on a model that predicts temperature in Delhi, is that we'll also extract driving-factor analysis here on the bottom. I just wanted to show you what that looks like. "Leading indicators" is what we call it. This is actually a multivariate forecast: we consider humidity, wind speed, and pressure when we're forecasting the temperature here. And then we also try to tell you: hey, look, wind speed tends to have its biggest impact at about a three-month lag, and I'm assuming that's due to seasonal wind changes, and they're correlated in the same direction. Then we show you that in the last three months there was about a 29.6% decrease in the trend of wind speed, and we think that's going to drop the temperature by about six degrees.

So you can even start to understand the time factors by which your input variables drive your outputs. If this were a sales forecast, this could be your marketing funnel: when your leads go up, then three months later, or whatever your sales cycle is, they're converted into this much revenue at this percentage. In the last three months your leads went up by 10%, so we think that's going to produce something like a one percent increase in the revenue forecast over the next three months, whatever your conversion rate is. So it's not just a forecast that you get: you also get a lot of analysis around what's driving it, and what the lag times between the inputs and outputs are, if you're doing a multivariate forecast like this. I just wanted to share that too.
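That kind of lagged relationship, a driver leading the target by some number of periods, is something you can sanity-check yourself with a simple lagged-correlation scan. This is a hand-rolled sketch of the idea, not the platform's method:

```python
import pandas as pd
import numpy as np

def best_lag(driver: pd.Series, target: pd.Series, max_lag: int = 6) -> int:
    """Return the lag (in periods) at which `driver` correlates most strongly
    with `target`, i.e. the delay by which the driver leads the target."""
    scores = {lag: target.corr(driver.shift(lag)) for lag in range(1, max_lag + 1)}
    return max(scores, key=lambda lag: abs(scores[lag]))

# synthetic check: build a target that is literally the driver delayed 3 steps
rng = np.random.default_rng(0)
driver = pd.Series(rng.normal(size=200))
target = driver.shift(3)
```

Here `best_lag(driver, target)` recovers the 3-period delay, the same way the leading-indicator view surfaces the three-month wind-speed lag.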
OK, so that's the feature, and some of the complexity you can handle with it to easily make forecasts. It seems like we've got some questions coming in, Elvis, so let's pause here and go to the questions.

The question here is: the model sometimes generates obvious results, like higher prices lead to higher revenue. Well, yeah, that's true, and the model is trying to surface the highest-impact relationships. So it's pretty frequently the case that if you know your business, you already know what the high-order impacts are. If you're interested in the relationships with lower-order impacts, the way to get at them is to ignore variables. In this example, where it says higher prices lead to higher revenue: if you ignore prices and then run the revenue model, it'll look at the other variables that are meaningful and tell you what they are and the ways in which they contribute. Or you can always keep drilling down in the driving-factor analysis to see the lower-level contributing factors. Usually what I find is there are some you didn't realize were as important as they are, and some you knew were important but didn't really know the magnitude of that importance very well, and helping you understand that makes a big difference. But the simple answer is: if you want, you can always put in just one input variable and look at its relationship to the output, as opposed to putting them all in. So sometimes, if I just want to know the relationship between, say, item number or description and revenue, I might only include item number in my forecast and see what it does.

More of my question is: what's the recommended way to go about this if you have
different variables that you think might play a role in this? How do you go about choosing? What's the best strategy?

My normal approach is to include as many of them as you have. The way the engine should work is that it surfaces the important ones and shows you: yes, we found a pattern between these variables and the output, or no, we didn't really find a good pattern between these variables and the output. Usually, and this is why we have a fast mode, I start with as many as there are, and then maybe I play around a little with removing some of them and looking at some of the other relationships. So it's flexible in that you can iterate and train models very quickly. You can even kick them off in parallel if you just open a new tab; you don't have to wait around once you start a training job. It's running in the cloud, so you can close the tab, come back into that model later, and it'll be done training.

OK, let's keep going down the list. What about the cryptocurrency market? It most probably does not depend on historic data, but more on different factors like geopolitics and so on. Yeah, so that's where you'd want to include geopolitics in your data set if you were going to train a model to predict crypto prices and they really depended on geopolitics. Now, that could be tricky. I don't know whether you have something like a political-sentiment model that scores how positive or negative people are on cryptocurrency, but some factor like that, which you could update over time along with the prices, would let the engine find the patterns for you. Traditionally, there's one main way to make an ML model better. You can do a lot of stuff with training longer and some
data cleaning, but usually, if you can get more data that's relevant to your outcome that you don't currently have, that's the biggest dial you can turn in terms of improving your models. That's pretty typical.

OK, let's see: are there tips for more accurate multivariate predictions? It's somewhat related to the one I asked. Yeah, OK. Like I said, I think you can throw in the data you have that you think is relevant, and we'll try to surface how relevant it is. But I would also play around with your forecast window. Sometimes people try to forecast three years forward off one year of data. I think we actually block you from going over 1:1, so if you have a year of data, we won't let you forecast more than a year forward. But even that is pretty tough: you're saying, here's half of the total duration, now tell me the other half. I would keep your forecast down to around 25-30% of your history; that's why we default it like that. But for the multivariate inputs, throw them in there and see if they matter. And we'll try to find the time relationships, not just the magnitude relationships: if there's a time offset, we're going to try to find that for you too.

OK, another interesting one here from Anonymous: how do the models handle it when there is scarce data, for example a SKU with low sales activity? Really good question. That's where you want to use the time-based aggregation. If that SKU is only selling once a month, you're probably going to want to aggregate monthly, and when you turn on the monthly aggregation, you want to set the window so that you have a data point in every one of your time windows. Then you can choose whether you want the sum or the mean.
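An illustrative pandas sketch of that monthly aggregation for a slow-moving SKU (the order history here is invented): using a sum means months with several orders get added up, and empty months show up as zeros instead of disappearing.

```python
import pandas as pd

# sparse order history for a low-sales SKU
orders = pd.Series(
    [3, 1, 2, 1],
    index=pd.to_datetime(["2024-01-05", "2024-01-20", "2024-02-11", "2024-04-02"]),
)

# monthly sum aggregation: every month window gets a value,
# including March, which had no orders at all
monthly = orders.resample("MS").sum()
```

The result has one row per month, January through April: 4, 2, 0, 1.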
If you're working with something like temperature, you'd want the mean, because you're going to assume the missing measurements are close to the average. But if you're working with something like a low-sales SKU, you're going to want the sum: if in some of your one-month windows you've got three sales, it'll add them up automatically for you, and if in some of your month windows you just have one, then it'll just have the one sale. It's almost like a smoothing function, and then you'll forecast the monthly sales forward. What it probably can't do is tell you exactly which day your random, scarce SKU activity is going to occur on, but it will tell you, at the resolution it can, call it monthly, how you can forecast that SKU forward.

Cool, let's go to the one at the top: are you including other variables to predict sales, or only date variables? OK, I think I did this one already: yeah, we include the other variables too. It's all multivariate, so you can send us other things that matter to the outcome.

And then: can we expect future models to take into account incidents and customer behavior, maybe off the internet, to inform the forecast? That's a very interesting question. It's an area where I think we'll have to invest over time as a company, because, as I said earlier, the best way to make a model better is to get information relevant to the outcome that you don't have into the model for it to look at. In our medium- and long-term roadmap, we're looking to do a lot of stuff around data augmentation, where you can bring stuff from the internet, and maybe even answers from large language models, into your data set to augment it. Today, large language models are a little bit slow.
If you're dealing with, say, a hundred thousand rows of data, it gets really hard to augment them on a row-by-row basis, and you have to have some level of comfort with sharing your row-by-row data with the LLM provider. Today we don't share record-level information with large language models. But I suspect that over time they're going to get faster and more capable, so absolutely, I think in the future models will be able to pull in relevant info. They're probably going to need to be guided, though: maybe you can even ask them what information would be relevant, and they might have an idea, but you, as the business or subject-matter expert, probably should also make a few suggestions about the type of info you want to include.

Yeah, I was going to add that as well: how do the other features, like the chat data prep and all the other data-engineering features, play a role here? It seems like they could play a role in the whole forecasting and modeling process, trying to find out what the features are, recommending features, getting some automation in a more automated way. That seems like a good fit.

Yeah, we have some really cool ideas there. We've started on the front end with chat prep and chat Explorer, where you can go text-to-answer, text-to-chart, or text-to-data-transformation. Actually, for the passengers getting off in San Francisco, I just told it to remove all rows that happened after COVID started, and those were the two different data sets I showed you. I did that transform right in the platform; it's very easy. That can be extended in a whole bunch of really interesting ways.
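That remove-rows-after-COVID transform amounts to a one-line filter. The column names, cutoff date, and passenger counts below are all made up for illustration:

```python
import pandas as pd

passengers = pd.DataFrame({
    "month": pd.to_datetime(["2019-11-01", "2019-12-01", "2020-01-01",
                             "2020-03-01", "2020-04-01"]),
    "count": [410_000, 395_000, 400_000, 160_000, 12_000],
})

# drop every row on or after the (hypothetical) COVID cutoff,
# leaving only the pre-pandemic history to train on
cutoff = pd.Timestamp("2020-03-01")
pre_covid = passengers[passengers["month"] < cutoff]
```

The filtered frame keeps only the three pre-pandemic months, which is exactly the "cut off right before COVID" data set from the demo.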
So I'd encourage everyone to stay tuned as we start to extend LLM capability into things like forecasting.

Next: how exactly do you add variables like geopolitics, and does the model look for current factors in a region as well as globally? Again, the answer today is that you need to bring the data to the table. We don't have a mechanism for augmenting geopolitical factors; you'd need to figure that out and include it in your data set if you're building such a forecast. If it's your business, and our primary customer is businesses forecasting business operations, you probably already have most of the data that's relevant to your outcome somewhere in your business: your demand, your inventory, your orders, your marketing pipeline, and all of that. But if you're doing something like cryptocurrency, as in this example, you're going to have to go get that data yourself today.

Will it bin frequencies, like a spectral analysis? We will look to learn the different frequency patterns inside the data. It's not a Fourier transform or anything; we're still just going to show you a normal forecast over time, and we don't really let you tease the frequencies apart too much, other than by looking at the seasonality. But if there's a slow pattern with a very fast pattern on top of it, it will learn the multiple frequencies in your data. Let's say you have a weekly pattern, a monthly pattern, and a quarterly pattern: we surface that to you in the seasonality view, but the models themselves are learning all of those patterns, and that's how they're able to reproduce them in the forward forecasts. If you looked at the temperature model, you could see it had a large pattern of seasonal temperature, but then it also had sharper, faster daily or weekly patterns as well.
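The platform surfaces this through its seasonality view rather than a spectrum, but the underlying idea, one series carrying several frequencies at once, can be checked yourself with a quick FFT sketch on synthetic data:

```python
import numpy as np

# a series with two superimposed cycles: a fast weekly pattern
# riding on top of a slower annual one
n = 730  # two years of daily points
t = np.arange(n)
y = 10 * np.sin(2 * np.pi * t / 365) + 2 * np.sin(2 * np.pi * t / 7)

spectrum = np.abs(np.fft.rfft(y - y.mean()))
freqs = np.fft.rfftfreq(n, d=1.0)  # cycles per day

# the two largest spectral peaks recover the two underlying periods
top = np.argsort(spectrum)[-2:]
periods = sorted(1 / freqs[top])
```

The recovered periods come out at roughly 7 days and 365 days, the same two rhythms a seasonality view would show as weekly and annual patterns.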
Cool. OK, well, I think we're running up against time here. I'd like to thank Elvis for hosting with us and helping do the demo. Any last parting thoughts, Elvis?

Yeah, I had one more question for you. I got this one a lot when I used to teach, back in the days when we were doing Elasticsearch and the Elastic Stack. A question that usually came up is: how do you deal with scale? When you deal with logs data, for instance, you're getting a lot of data and the volume is really high. Usually tools are quite limited in terms of how much they can forecast and how much data that actually takes, and when you're dealing with real-time data like this, it makes it quite challenging; there's always this scale component to it. Can you speak about any plans or anything about scale that you can share, any use cases, anything like that?

Yeah, so we have customers operating at scale. I think our biggest one is building off a 260-million-row input data set, doing lifetime-value predictions. So we're designed to work with large amounts of data. But with time series in particular: I showed you a data set that was about a million rows, and that was just this year so far for the state of Iowa. You could also train that model to go back multiple years, and those data sets get pretty big. The real consideration there is that the model takes a bit longer to train, even in fast mode. So I would tell everybody: if you're bringing one to ten million rows of time examples, expect that the model might take 15 minutes to train, maybe even an hour if it's really big. We've spent a lot of time working on the efficiency of our ML engine, and I still think that's pretty fast
compared to what a lot of the training mechanisms out there offer. But yeah, the main trade-off is that as your data size really scales, or if you're including a lot of text data, because we encode all the text, those two things can add time to the training process, and you should just be aware that at large scale that's possible. Now, if you've got it connected to a live data source and it's automatically retraining, once you get it built the first time it'll just take care of that in the background for you, so you don't have to wait around; it'll send you an update when it's all finished.

That's awesome, that's a really cool feature, taking advantage of the real-time training and retraining. There's another question here, if you don't mind: would there be a feature where we can understand statistics in simple, common language? Many business people do not understand statistics, so do we see a feature where forecasting is made easy for a common user?

Yeah, so that's what we're trying to do, and I'll say we're not all the way there yet. But one thing we do is, every time we tell you a piece of information in the platform, you can hover over it with your mouse and we'll explain it in a little more detail. When we tell you something like root mean square error, we also define it right next to it, and we tell you how you should try to optimize around it. I don't think we'll be inventing a new language for statistics or anything like that, but we are going to try to make the information really easy to understand visually, and we're going to explain why it matters to you in as plain a language as we possibly can at every step along the way. I think if you play around for a little while in our platform, you'll find it's getting easier every day.
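For instance, the root mean square error mentioned above is just the typical size of a forecast miss, in the same units as the thing being forecast. A minimal sketch:

```python
import numpy as np

def rmse(actual, predicted) -> float:
    """Root mean square error: the typical size of a forecast miss,
    expressed in the same units as the forecast itself."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean((actual - predicted) ** 2)))
```

So if actual sales were [10, 12, 14] and the forecast said [11, 12, 16], the RMSE is about 1.29: on a typical day the forecast misses by a bit more than one unit.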
And we're committed to making it as easy as we possibly can. We don't want you to need a deep knowledge of statistics to use the platform. The idea here is: if you can use Excel, you should be able to use Akkio, and that's what we ask ourselves every day, whether we've cleared that bar. I'll say we've still got lots of work to do, but we wake up and work on it every day.

Thanks a lot, that's really helpful. Cool, so those are the questions. Well, I think that takes us to the end. If you have any questions, we're always happy to engage and help out, and like we said at the beginning, the platform is open: please feel welcome to make a free trial account and try it out yourself. All right, thank you everyone, thank you.
