Speaker: Scott Klein, Technical Evangelist Microsoft
Summary: As cloud computing becomes more popular and cloud-based solutions become the norm rather than the fringe, the ability to migrate your databases efficiently is crucial. This demo-filled session will cover the tips, tricks, methods, and strategies for migrating your on-premises SQL Server databases to Windows Azure SQL Database, AKA SQL Azure. Focusing primarily on SQL Server Data Tools and the DAC Framework, this session will show how these tools can make you a kung-fu migration master.
About Scott: Scott Klein is a Corporate Technical Evangelist for Microsoft focusing on Windows Azure SQL Database (AKA SQL Azure) and related cloud-ready data services. His entire career has been built around SQL Server; he has worked with it since the 4.2 days. Prior to Microsoft he was a SQL Server MVP for several years, and then became one of the first four SQL Azure MVPs. Scott is the author of over half a dozen books for both Wrox and Apress, including Pro SQL Azure. He can be found talking about Windows Azure SQL Database and database scalability and performance at events large and small, wherever he can get people to listen, such as SQL Saturday events, local SQL Server user groups, and TechEd.
Question: Is there a “per transaction” cost for Windows Azure SQL Database (SQL Azure)?
Short Answer: No
I recently answered a question on the MSDN forum about transactions and their associated cost in SQL Azure. As of now, there is no “per transaction” cost associated with SQL Azure. There are two parameters that affect your SQL Azure bill: 1) database size, and 2) outbound data transfer. An example of an outbound transfer would be data accessed by an application hosted outside of your Azure database's data center.
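To make the two billing dimensions concrete, here is a minimal sketch of how such a bill is computed. The rates below are invented purely for illustration (they are not Microsoft's actual prices, which are on the official pricing page):

```python
# Sketch of the two SQL Azure billing dimensions: database size and
# outbound data transfer. There is no per-transaction charge.
# Both rates are HYPOTHETICAL, chosen only for illustration.
DB_RATE_PER_GB = 9.99       # hypothetical monthly $ per GB of database size
EGRESS_RATE_PER_GB = 0.12   # hypothetical $ per GB of outbound transfer

def monthly_bill(db_size_gb, outbound_gb):
    """Only size and egress matter; running a million transactions
    against the same data would not change this number."""
    return db_size_gb * DB_RATE_PER_GB + outbound_gb * EGRESS_RATE_PER_GB

# 5 GB database, 10 GB of data read by an app outside the data center:
print(round(monthly_bill(5, 10), 2))
```

The point of the sketch is simply that transaction count appears nowhere in the formula.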
If you want to read more about SQL Azure pricing, here’s the official resource:
I normally blog about the answers that I give on MSDN forums. The answer on the forum is generally brief and to the point, and in the blog post I expand it to cover related areas. Here are the questions for which I chose not to write a blog post, so I am just going to archive them for now:
With all the news about “Big Data”, I had a question:
Where does Big Data come from?
So I researched and here are the Big Data “Sources” that I found:
1. Enterprise data (emails, Word documents, PDFs, etc.)
2. Social Media
3. Sensor Data
4. Public Data (energy, world resources, labor statistics, etc.)
Am I missing anything? Please feel free to point those out!
Update [11 July]: Claytonbingham adds “Academic Data” to the list, which I think can sometimes also be referred to as “scientific data” (thanks, Claytonbingham! Please refer to the comments section for his thoughts).
Microsoft's project codename “Social Analytics” is one nice beta project! Quoting from its site:
“it is aimed at developers who want to integrate social web information into business applications”
But the KEY here is that it allows you to integrate FILTERED social web information into your business applications. Today, you could go ahead, grab Twitter stream data, and embed it in your application, but guess what? In most cases, it's too much information. Too much information means it's very difficult for business folks to take action by analyzing these truckloads of information. And so:
even though we are data-rich – we are information (insight) poor.
My point is that the tons of information PRODUCED by [customers, partners, critics, employees…] and GATHERED from [Twitter, Facebook, LinkedIn, blogs…] is NOT useful in its raw form. To take action based on all these data points, what we need is a way to categorize (filter) the data, which would help the decision maker see only the SMALL part of the data set he/she needs for performing that particular analysis.
Let’s take an example:
A business-decision-maker wants to see “All twitter-users who have posted positive reviews about Windows 8 Design and User Experience”
How would you solve it?
Thinking of writing your own sentiment analyzer? Awesome, and good luck!
Anyway, maybe you already know it's not straightforward to answer the above question using raw Twitter data.
Of course, you could use third-party tools to solve the problem. Don't get me wrong, I am not asking you to ignore them. But here's how Microsoft Social Analytics helped me solve the above problem:
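To see why it's not straightforward, here is a deliberately naive keyword-based filter sketch. The tweets, topic terms, and word list are all made up for the example; this is nothing like a real sentiment analyzer, which is exactly the point:

```python
# A naive keyword filter for "positive tweets about Windows 8 UX".
# All data below is invented for illustration.
POSITIVE_WORDS = {"love", "great", "awesome", "beautiful"}
TOPIC_TERMS = {"windows 8", "metro"}

def looks_positive(tweet):
    """True if the tweet mentions the topic AND contains a positive word.
    Crude substring matching -- negation, sarcasm, and context all fool it."""
    text = tweet.lower()
    on_topic = any(term in text for term in TOPIC_TERMS)
    positive = any(word in text for word in POSITIVE_WORDS)
    return on_topic and positive

tweets = [
    "I love the Windows 8 design!",        # correctly matched
    "Windows 8 start screen confuses me",  # correctly rejected
    "I don't love Windows 8 at all",       # negation: a false positive
]
print([t for t in tweets if looks_positive(t)])
```

The third tweet slips through because a bag-of-keywords filter can't see the negation, which is why raw-stream filtering quickly becomes a hard problem.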
Here’s how I FILTERED the data (using a tool called the Social Analytics Engagement Client):
And as you can see, there is more than one way to slice/filter your data to provide a view that is best suited for a particular analysis assignment.
1. Currently in beta, only two data sets are available: Bill Gates and Windows 8.
2. Apart from this nicely designed web-based engagement tool, you can integrate the information into applications using the Social Analytics API.
Up until April 2012, the only way to run SQL Server in the cloud was “SQL Azure”. But recently AWS announced SQL Server on its cloud. Good news? Probably; it's always good to have more than one option. So for those who are new to the world of AWS, here are a few tips before you get hands-on:
1) The way RDS works is that you spin up “DB instances”. Here you specify the machine size that will “power” your database, and remember that the type of instance you choose directly affects your bill.
2) Spend some time understanding the billing structure. Since AWS gives you a lot of options, its billing structure is not simple. Don't get me wrong, I am not saying that having a lot of options in AWS is bad; it's just that the billing is not simple and not one-dimensional (there are various dimensions that shape your billing structure). And why should you invest the time? Because in the “pay-as-you-go” model it directly affects your bill.
3) Understand additional costs, like the cost to back up the database PLUS the data-transfer cost.
4) Understand the difference between the “bring your own license” and “license included” (Express, Standard, and Web editions only; Enterprise edition is currently not included) models in RDS SQL Server.
5) And unlike SQL Azure, RDS SQL Server charges on a “per hour” basis.
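The multi-dimensional billing in points 2 through 5 can be sketched roughly as follows. Every rate here is HYPOTHETICAL, made up only to show how the dimensions combine; check the AWS pricing pages for real numbers:

```python
# Rough sketch of how several dimensions combine in an RDS SQL Server bill.
# All rates are INVENTED for illustration, not real AWS prices.
HOURLY_INSTANCE_RATE = 1.035   # hypothetical $/hour, license-included instance
BACKUP_RATE_PER_GB = 0.10      # hypothetical $ per GB-month of backup storage
EGRESS_RATE_PER_GB = 0.12      # hypothetical $ per GB of outbound transfer

def rds_monthly_bill(hours_running, backup_gb, egress_gb):
    """Unlike SQL Azure's size-based charge, the instance bills for every
    hour it runs, whether or not any queries are executed."""
    return (hours_running * HOURLY_INSTANCE_RATE
            + backup_gb * BACKUP_RATE_PER_GB
            + egress_gb * EGRESS_RATE_PER_GB)

# An instance running 24x7 for a 30-day month, 20 GB of backups, 50 GB egress:
print(round(rds_monthly_bill(24 * 30, 20, 50), 2))
```

Even in this simplified form, the instance-hours term dominates, which is why choosing the right instance type (point 1) matters so much.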
Note the date of this post: 15 May 2012. Things change very fast, so readers-from-the-future, please refer to the official documents.
BTW, here are a few blog posts from the web-o-sphere: