Does the stage table get all the data in every load?

I have a table called Employee in my source which has 100 records. It is modeled as a hub and a satellite. When I load data from source to stage, 100 records are inserted into stage; then the distinct records are inserted into the hub and satellite from stage. Assume that after the 1st load the Employee table gets 5 new rows, so the total is now 105 records. In the 2nd load we truncate the stage table, load all 105 records, and then load from stage to the hub and satellite. After N loads the source record count may get huge. So what is the point of truncating and reloading all the data into the stage table again and again?
Is this a good practice to follow?
What are the advantages and disadvantages of this type of load?

In the DVA VTCP model, the DV model was designed with the help of views, so there is no stage truncate process. The stage views have a filter so that data already present in the raw vault tables is not loaded again.
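A minimal Python sketch of that filtering idea (the table names, keys, and use of MD5 hash keys are assumptions for illustration, not the actual DVA VTCP implementation): the "staging view" passes through only those source rows whose hash key is not already present in the hub.

```python
import hashlib

def hash_key(business_key: str) -> str:
    # MD5 of the business key, a common choice for Data Vault hash keys.
    return hashlib.md5(business_key.encode("utf-8")).hexdigest()

# Hypothetical source snapshot and existing hub content.
source_rows = [{"emp_id": "E1"}, {"emp_id": "E2"}, {"emp_id": "E3"}]
hub_employee = {hash_key("E1")}  # E1 was loaded in a previous run

# The "staging view": only rows whose hash key is not yet in the hub.
stage_view = [r for r in source_rows
              if hash_key(r["emp_id"]) not in hub_employee]
# Only E2 and E3 flow through to the raw vault load.
```

In SQL this would typically be a `WHERE NOT EXISTS` (or left anti-join) against the hub inside the view definition, so no data is moved or truncated.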

The reason for truncating stage is that data might change: you don't want to miss changes in the source systems.
It is vital to follow this approach.
The advantage is that you can fully replicate the source. The disadvantage is that it takes some time to load all the data.

Why do you need to do that?

Landed content should be new, either as a snapshot of the source or as a CDC result; staging can be deployed as a view (no data movement). If you keep all history in staging, this is known as a persistent staging area (PSA). Then the question is: why are you deleting content in there?

A “growing” stage needs additional logic to ensure you are comparing only new or changed content with the target when loading.