
User Group Meetings Next Week (free training everyone!) July 11, 2014

Posted by mwidlake in Exadata, Meeting notes, UKOUG.

I know, posts about upcoming user group meetings are not exactly exciting, but it’s good to be reminded. You can’t beat a bit of free training, can you?

On Monday 14th I am doing a lightning talk at the 4th Oracle Midlands event. The main reason to come along is to see Jonathan Lewis talk about designing efficient SQL and then he will also do a 10 minute session on Breaking Exadata (to achieve that aim I suggest you just follow the advice of the Oracle Sales teams, that will break Exadata for you pretty efficiently!).

If you are not familiar with the Oracle Midlands events, they are FREE evening events run in Birmingham, just north of the centre in Aston, at the Innovation Birmingham centre. See the web site for details and to register. The great thing about them being in the evening is you do not have to take time out from the day job to attend. The disadvantage, of course, is that they are shorter. (And for me personally, I feel morally obliged to pop in on my dear old Mother on the way and listen to her latest crazy theories. I think it’s what Mums are for). Samosas were provided last time to keep you going, and I know a couple of us will retire to a nearby pub after to continue discussions.

I’ll be doing just a short talk, along with half a dozen others, my topic being “is the optimizer getting too smart to be understood”.

This is a user group in the truest sense, organised pretty much by one chap (Mike Mckay-Dirden) in his spare time, with help from interested people and some financial input from the sponsor, Redgate.

I get a day off and then I am at the combined RAC CIA and Database SIG on Wednesday 16th. This UKOUG SIG is probably the other end of the user group experience, in terms of organisation and size. They both fulfil a need and I have no problem being involved in both. In fact, I now notice that Patrick Hurley is going to be at both events too.

The RAC CIA & Database SIG is also free, IF you are a member of the UKOUG. You can also attend if you pay a one-off fee. It’s an all-day event and as it is a combined SIG it is a two-track event. It’s almost a mini-conference! Presenters include myself (doing my intro to Exadata talk, probably for the last time), Julian Dyke, Patrick Hurley, Martin Bach, Neil Chandler, Neil Johnson, Martin Nash (twice!), Ron Ekins, John Jezewski, Alex Evans and David Kurtz. If that is not enough, Owen Ireland is going to give a support update and then we have Mike Appleyard giving a keynote on a brand new 12.1 feature, Oracle Database In-Memory option.

I’d be going along even if I was not presenting or helping run the RAC CIA SIG and I’m retired for goodness sake! (well, sort of, my wife has not ordered me back to the working life yet). And of course, we will no doubt retire to a hostelry after (to count how many Neils and Martins are involved).

I hope to see as many UK people as possible at these two days. As I said at the top, it’s free training, you can’t get better than that.

RMOUG – Here I Come February 9, 2013

Posted by mwidlake in Exadata, Meeting notes, Private Life.

Well, I’ve just finished pushing the last few bits into my suitcase for my trip to the US for the Rocky Mountain User Group Training Days 2013.

It is a few years since I went to the US for pleasure (3 years?) and much longer since I went there on a combined work/pleasure trip – as I HATE going through US immigration.

I was tempted into this trip when I met up with a fellow OakTable member Tim Gorman at the Slovenia User Group last October. Tim was a true gentleman throughout that conference (and this is not to detract from the kind ministrations of Jože Senegačnik who was a wonderful host and also a fellow OakTable member) and he suggested I put forward a talk or two to the RMOUG training days conference. I was in two minds due to my huge dislike of being shouted at by the {I am sad to say, usually bl00dy unfriendly} members of the US immigration services. I mean, I went to Moscow in around 1994 and those chaps in the USSR were positively oozing cordial welcome compared to the hard-nosed and antagonistic attitude of the US chaps a year or two later!

Anyway, Tim swayed my opinion in a simple and direct manner. He mentioned skiing. I went skiing in 1992. My one and only skiing holiday. I loved it. I spent two weeks going from terrified starter to someone who could swish down Blue runs with the rest of them, and the occasional Red run towards the end. Not Black; I saw enough sad-looking stretchers coming down from the Black runs as I sat in the bars at the bottom to think there was something to be scared of there – and when I chatted to the barman in my chosen watering hole, he told me how many of them were us Brits who lacked the sense to realise what our limits were.

So, I will be at the conference doing my bit about “first things to know about Exadata” and then I will be up in the mountains, scaring myself on hills that when I was 20 looked like a walk in the park…. What is the betting I do not get as far as Red runs this time?!?

Friday Philosophy – I Am An Exadata Expert August 10, 2012

Posted by mwidlake in Exadata, Friday Philosophy, Perceptions.

(Can I feel the angry fuming and dagger looks coming from certain quarters now?)

I am an Exadata Expert.

I must be! – I have logged onto an Exadata quarter rack and selected sysdate from Dual.

The pity is that, judging from some of the email threads and conversations I have had with people over the last 12 months, this is more real-world experience than some of those offering consultancy services can claim. It’s also more experience than some people I have actually met who have extolled their knowledge of Exadata – knowledge based solely on presentations by Oracle sales people looking at the data sheets from 10,000 feet up and claiming it will solve world hunger.

Heck, hang the modesty – I am actually an Exadata Guru!

This must be true as I have presented on Exadata and it was a damned fine, technical presentation based on real-world experience, and I have even debated, in public, the pros and cons of point releases of Exadata. Touching base with reality once more, I did an intro talk, “The first 5 things you need to know about Exadata”, and the “debate” was asking Julian Dyke if he had considered the impact of serial direct IO on a performance issue he had seen – and he had not only done so but looked into the issue far more than I had, so he was able to correct me.

But joking aside – I really am a true consulting demi-god when it comes to Exadata.

I have years of experience across a wide range of Exadata platforms. That would be 0.5 years and I’ve worked intensively on just one system and am in a team now with some people who are proper experts. So a range of two. Yes, tongue is still firmly in cheek.

This situation always happens with the latest-greatest from Oracle (and obviously all other popular computing technologies). People feel the need to claim knowledge they do not have. Sometimes it is to try and get consultancy sales or employment, sometimes it is because they don’t want to be seen to be behind the times and sometimes it is because they are just deluded. The deluded have seen some presentations, a few blog posts and maybe even got the book and read the first few chapters, and are honestly convinced in their own minds that they now know enough to make effective use of the technology, teach {or, more usually, preach} others and so proclaim on it. {See the Dunning-Kruger effect, the certainty of idiots}. I’m certainly not arguing against going to presentations, reading blogs and books and learning, just don’t make the mistake of thinking theoretical, second-hand knowledge equates to being an expert.

With Exadata this situation is made worse as the kit is expensive and much of what makes it unusual cannot be replicated on a laptop, so you cannot as an individual set up a test system and play with it. Real-world experience is required. This is growing but is still limited. So the bullshit-to-real-skills quotient remains very, very high.

If you are looking for help or expertise with Exadata, how do you spot the people with real knowledge from the vocal but uninformed? Who do you turn to? {NB don’t call me – I’m busy for 6 months and I really am not an expert – as yet}. If your knowledge to date is based on sales presentations and tidbits from the net which may or may not be based on a depth of experience, it is going to be hard to spot. When I was still without real-world experience I had an unfair advantage in that I saw email threads between my fellow OakTable members, and of course some of those guys and gals really are experts. But I think I was still hoodwinked by the odd individual on the web or presenting and, I can tell you, though this background knowledge really helped – when I DID work on my first Exadata system, I soon realised I did not understand a lot about the subtleties and not-so-subtleties of using a system where massively improved IO was available under key conditions. I had to put in a lot of time and effort and testing to move from informed idiot to informed, partially experienced semi-idiot.

I know this issue of the non-expert proclaiming their skills really frustrates some people who do know their stuff for real, and it is of course very annoying if you take someone’s advice (or even hire them) only to find their advice to be poor. Let’s face it, it is simple lying at best and potentially criminal mis-selling.

I guess the only way is for people needing help to seek the help of someone who has already proven themselves to be honest about their skills, or who can demonstrate a real-world level of experience and success. I would suggest the real experts should do that most difficult task of pointing out the mistakes of the false prophets, but it is very tricky to do without looking like a smartarse or coming over as big-headed or jealous.

I’ll finish on one thing. Last year I said how I thought maybe I should do more blog posts about things I did not know much about, and be honest about it and explore the process of learning. I did actually draft out about 3 posts on such a topic but never pushed them out as I was way too busy to complete them… That and, being candid, I really did not want to look like an idiot. After all, this Oracle lark is what puts beer in my hand, hat fabric on my wife’s millinery worktop and food in my cat’s bowl. The topic was….? Correct, Exadata. Maybe I should dust them off and put them out for you all to laugh at.

Oracle Exadata – does it work? July 26, 2009

Posted by mwidlake in development, VLDB.

Does Oracle Exadata work? 

That is a tricky question as, unlike most Oracle database features, you can’t download it and give it a test.

You can try out partitioning, bigfiles, Oracle Text, interMedia {sorry, Multimedia}, all sorts of things by downloading the software. You can even try out RAC pretty cheaply, using either VMware or a couple of old machines and Linux, and many hundreds of Oracle techies have. The conclusion is that it works. The expert conclusion is “yes it works, but is it a good idea? It depends {my fees are reasonable}” :-).

I digress, this ability to download and play allows Oracle technophiles to get some grounding in these things, even if their employer is not currently looking to implement them {BTW how often do you look at something in your own private time that your company will not give you bandwidth for – only to have them so very interested once you have gained a bit of knowledge? Answers on a postcard please…}.

Exadata is another beast, as it is hardware. I think this is an issue.

I was lucky enough to get John Nangle to come and present on Exadata at the last UKOUG Management and Infrastructure meeting, having seen his talk at a prior meeting. John gave a very good presentation and interest was high. I have also seen Joel Goodman talk {another top presenter}, so I understand the theory. I have to say, it looks very interesting, especially in respect of what is, perhaps, my key area of personal expertise: VLDB – databases of tens of terabytes.

I don’t plan to expand here on the concepts or physical attributes of Exadata too much; it is enough to say that it appears to gain its advantage via two main aspects:

  • Intelligence is sited at the “disc controller” level {which in this case is a cheap 4-CPU HP server, not really the disc controller}, which basically pre-filters the data coming off storage so only the data that is of interest is passed back to the database. This means that only blocks of interest are chucked across the network to the database.
  • The whole system is balanced. Oracle have looked at the CPU-to-IO requirements of data warehouses and decided what seems to be a good balance; they have implemented fast, low-latency IO via InfiniBand and made sure there are a lot of network pipes from the storage up the stack to the database servers. That’s good.

The end result is that there is lots of fast, balanced IO from the storage layer to the database and only data that is “of interest” is passed up to the database.
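
As a rough sketch of what that storage-layer filtering would look like from the database end – assuming an 11.2-era Exadata system where these v$sql columns and cell statistics are available, which is not something I have been able to run myself – you could compare how many bytes a scan was eligible to offload with how much actually came back over the interconnect:

-- Hedged sketch: was this statement's IO eligible for offload to the cells,
-- and how much data actually travelled over the interconnect?
-- (Pick out your own statement; the sql_text filter is just illustrative.)
SELECT sql_id,
       io_cell_offload_eligible_bytes / 1024 / 1024 AS eligible_mb,
       io_interconnect_bytes          / 1024 / 1024 AS interconnect_mb
FROM   v$sql
WHERE  sql_text LIKE '%full_scan_test%';

-- The session-level statistics tell a similar story:
SELECT n.name, s.value
FROM   v$mystat s
JOIN   v$statname n ON n.statistic# = s.statistic#
WHERE  n.name IN ('cell physical IO bytes eligible for predicate offload',
                  'cell physical IO interconnect bytes returned by smart scan');

If the interconnect figure is a small fraction of the eligible figure, the cells really are doing the filtering rather than shipping whole blocks up to the database.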

It all sounds great in theory and Oracle Corp bandy around figures of up to 100 times (not 100%, 100 times) speedup for data warehouse activity, with no need to re-design your implementation. At the last M&I UKOUG meeting there was also someone who had tried it in anger, and they said it was 70 times faster. Unless this was a clever plant by Oracle, that is an impressive, independently reported increase.

I am still very interested in the technology, but still sceptical. After all, RAC can be powerful, but in my experience it is highly unlikely that by dropping an existing system onto RAC you will get any performance (or high availability) increase. In fact, you are more likely to just make life very, very difficult for yourself. RAC works well when you design your system up-front with the intention of working on the same data on the same nodes. {Please note, this is NOT the oft-cited example of doing different work types on different nodes, i.e. data load on one node, OLTP on another and batch work on the third. If all three are working on the same working set, you could well be in trouble. You are better off having all load, OLTP and batch for one set of data on one node, OLTP-load-batch for another set of data on another node, etc, etc, etc. If your RAC system is not working well, this might be why}. Similarly, partitioning is an absolutely brilliant feature – IF you designed it up-front into your system. I managed to implement a database that has scaled to 100 TB with 95% of the database read-only {so greatly reducing the backup and recovery woes}, as it was designed in from the start.
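
As a minimal sketch of the kind of designed-in-from-the-start partitioning I mean (the table, column and tablespace names here are made up purely for illustration), you range-partition by date up front and then make the tablespaces holding the older, static partitions read-only so they drop out of the regular backup cycle:

-- Illustrative only: hypothetical names, date-range partitioning with one
-- tablespace per year so old years can be made read-only.
CREATE TABLE orders
( order_id    NUMBER         NOT NULL
, order_date  DATE           NOT NULL
, customer_id NUMBER
, details     VARCHAR2(400)
)
PARTITION BY RANGE (order_date)
( PARTITION p_2007 VALUES LESS THAN (DATE '2008-01-01') TABLESPACE orders_2007
, PARTITION p_2008 VALUES LESS THAN (DATE '2009-01-01') TABLESPACE orders_2008
, PARTITION p_curr VALUES LESS THAN (MAXVALUE)          TABLESPACE orders_curr
);

-- Once a year's data has stopped changing, back its tablespace up one last
-- time and take it out of the regular backup cycle.
ALTER TABLESPACE orders_2007 READ ONLY;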

Where was I? Oh yes, I remain unconvinced about Exadata. It sounds great; it sounds like it will work for data warehouse systems where full table scans are used to get all the data and the Oracle instance then filters most of the data out. Now the storage servers will do that for you. You can imagine how, instead of reading 500GB of table off disc, across the network and into Oracle memory and then filtering it, the eight disc servers will do the filtering and send a GB of data each up to the database. It has to be faster.

BUT.

What if you have some OLTP activity and some of the data is in the SGA? That is what stops full table scans working at multi-block read count levels of efficiency.

What happens if some of the table is being updated by a load process at the same time?

What happens if you want some of the data hosted under ASM and full Exadata performance brilliance, but you have several tens of TB of less-active data you just want to store on cheap SATA RAID 5 discs as well? How does Exadata integrate then?

You can’t test any of this out. I did email and ask John about this inability to play with and discover stuff about a solution that is hardware and very expensive. And he was good enough to respond, but I think he missed the point of my question {I should ask again, he is a nice chap and will help if he can}. He just said that the DBA does not have to worry about the technology, it just works. There are no special considerations.

Well, there are. And I can’t play with it as I would need to buy a shed load of hardware to do so. I can’t do that, I have a wife and cat to feed.

So even though Exadata sounds great, it is too expensive for anyone but large, seriously interested companies to look into.

And I see that as a problem. Exadata experts will only come out of organisations that have invested in the technology, or Oracle itself. And I’m sorry, I’ve worked for Oracle and as an employee you are always going to put the best face forward. So, skills in this area are going to stay scarce unless it takes off, and I struggle to see how it will take off unless it is not just as good as Oracle says, but better than Netezza and Teradata by a large margin.

Does anyone have an Exadata system I can play on? I’d love to have a try on it.
