This is part of the big data problem that a lot of major companies are dealing with. Chances are you won't ever generate terabytes of data on a daily basis, but you can still generate quite a bit and run into the same concerns. For example, I regularly work with 100M-user data sets in the 20 GB file size range.
MySQL seems to be the norm right now, coupled with tools like Hadoop/Hive/AWS. I don't work with the data warehousing side at an e-commerce level, so I can't comment on the DB architecture. If I were to approach it, though, I would definitely try to keep a series of static tables (customer, job, page, etc.), not just an annual one. By segmenting the data and using GUIDs with proper keys, you should be able to keep things moving along fast enough. Also, remember that for a lot of reporting/analysis you shouldn't be working off a production server. I keep a small, slightly outdated data set on a local MSSQL server so I can bog it down without pissing off the bosses when I need to develop new queries.
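To give a rough idea of what I mean by segmenting into static tables with GUID keys, here's a toy sketch using Python's sqlite3 as a stand-in for MySQL/MSSQL (the schema and table/column names are made up for illustration):

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Segment data into static reference tables plus a fact table,
# instead of one giant annual table.
cur.executescript("""
CREATE TABLE customer (
    customer_id TEXT PRIMARY KEY,  -- GUID key
    name        TEXT NOT NULL
);
CREATE TABLE page_view (
    view_id     TEXT PRIMARY KEY,  -- GUID key
    customer_id TEXT NOT NULL REFERENCES customer(customer_id),
    url         TEXT NOT NULL,
    viewed_at   TEXT NOT NULL
);
-- Index the foreign key so joins stay fast as the fact table grows.
CREATE INDEX idx_page_view_customer ON page_view(customer_id);
""")

cust_id = str(uuid.uuid4())
cur.execute("INSERT INTO customer VALUES (?, ?)", (cust_id, "Alice"))
cur.execute(
    "INSERT INTO page_view VALUES (?, ?, ?, ?)",
    (str(uuid.uuid4()), cust_id, "/checkout", "2013-01-01T12:00:00"),
)
conn.commit()

# A reporting query hits the indexed key rather than scanning everything.
rows = cur.execute(
    "SELECT c.name, p.url FROM page_view p "
    "JOIN customer c USING (customer_id)"
).fetchall()
print(rows)  # [('Alice', '/checkout')]
```

Same idea at scale: keep the dimension tables small and static, key everything properly, and let the indexes do the work on the big fact tables.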