# Wednesday, April 08, 2009

It was a tough night preparing the demo for the TechDays in Bern: setting up the PDAs and struggling with a few of them on which the CRM client would not install. Fortunately, after a couple of hard resets, everything was ready for H-hour.

Have a look at the setup in the hotel room :-)

Thanks to Julien and Didier for the PDAs!

Wednesday, April 08, 2009 2:31:54 PM (GMT Daylight Time, UTC+01:00)  #    Comments [2] -
TechDays
# Wednesday, April 01, 2009

The next version of the framework puts more focus on dynamic languages while keeping C# and VB.NET evolving in parallel.
It also introduces extensions to support parallelism in both a declarative and an imperative way. Among these extensions are PLINQ (Parallel LINQ) to parallelize LINQ queries, the TPL (Task Parallel Library) and the CDS (Coordination Data Structures).
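As a quick illustration of the declarative side, here is a minimal PLINQ sketch in C#; the number range and the IsPrime helper are made up for the example, not taken from the session.

```csharp
using System;
using System.Linq;

class PlinqSample
{
    static void Main()
    {
        // Declarative parallelism: AsParallel() lets the runtime
        // partition the query across the available cores.
        int primes = Enumerable.Range(2, 1000000)
                               .AsParallel()
                               .Where(IsPrime)
                               .Count();

        Console.WriteLine("Found {0} primes", primes);
    }

    // Naive primality test, only here to give the query some work to do.
    static bool IsPrime(int n)
    {
        for (int i = 2; i * i <= n; i++)
            if (n % i == 0) return false;
        return true;
    }
}
```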
One of the new features is the ability to install the Client Profile on any configuration, whereas in the previous version it had to be installed on a clean machine.
Among other things: new controls for WPF, multitouch support under Windows 7, and easier development of user interfaces based on data sources, even though the Entity Framework is not yet supported in the presented version. MVC will be implemented with Dynamic Data support.
On the WCF side, enhancements such as SOAP over UDP and broader WS-* support have been added. Moreover, building a RESTful application will be simplified.
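As a baseline for those simplifications, here is a minimal sketch of a RESTful WCF service using the web programming model that already exists in .NET 3.5; the service name, URI template and port are invented for the example.

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface IGreetingService
{
    [OperationContract]
    [WebGet(UriTemplate = "greet/{name}", ResponseFormat = WebMessageFormat.Json)]
    string Greet(string name);
}

public class GreetingService : IGreetingService
{
    public string Greet(string name)
    {
        return "Hello, " + name;
    }
}

class Program
{
    static void Main()
    {
        // WebServiceHost wires up WebHttpBinding and the REST behavior for us.
        using (var host = new WebServiceHost(typeof(GreetingService),
                                             new Uri("http://localhost:8080/")))
        {
            host.Open();
            Console.WriteLine("Try GET http://localhost:8080/greet/world");
            Console.ReadLine();
        }
    }
}
```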
With the .NET Framework 4, WF and WCF will work together, as they become completely integrated. A big change, though: workflows will be defined in XAML only by default. On the other hand, more activities will be available in the toolbox.
A lot of improvements in terms of performance and scalability have been made, and a new workflow type is now available: Flowchart. Activities can now receive arguments and return values.
When developing custom activities, it will be possible to define a custom designer for them using WPF.
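As a rough illustration of activities receiving arguments and returning values, here is a minimal sketch against the System.Activities API as it eventually shipped in .NET 4; the Multiply activity and its argument names are invented for the example.

```csharp
using System;
using System.Activities;

// A custom activity that receives two arguments and returns a value.
public sealed class Multiply : CodeActivity<int>
{
    public InArgument<int> Left { get; set; }
    public InArgument<int> Right { get; set; }

    protected override int Execute(CodeActivityContext context)
    {
        return Left.Get(context) * Right.Get(context);
    }
}

class Program
{
    static void Main()
    {
        // WorkflowInvoker runs the activity synchronously and returns its result.
        int result = WorkflowInvoker.Invoke(new Multiply { Left = 6, Right = 7 });
        Console.WriteLine(result); // 42
    }
}
```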
In short, the fourth version of .NET allows a workflow (WF) to be exposed through WCF as its interface, while on the hosting side it will rely on Dublin, an extension to the Windows Server 2008 application server role.

Wednesday, April 01, 2009 12:57:04 PM (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
TechDays
# Tuesday, March 31, 2009

Once again, I have the pleasure of participating in the TechDays in Geneva at the CICG. This year, not only will I be present at the CTP booth, but we are one of the Premium Sponsors of the event and we also have a speaker on stage to talk about Velocity. Without saying too much about this new technology coming out really soon, just note that it will be Microsoft's distributed cache solution. Come to the session to learn more about it.

Today, during the setup of the booth (by the way, come and see us to take part in our multi-touch contest and see a multi-touch application running), I had the opportunity to talk with people from Wygwam and to play with their Microsoft Surface. It is just unbelievable. So much so that, to appreciate it, you have to try it... For example, there was a drum-playing game that was a lot of fun. I want one of these :-) At the very least, it is a good way to have guests at home every day :-)

On tomorrow's agenda: a bit of Visual Studio 2010 with the .NET Framework 4 and ASP.NET, Geneva (the identity management system, not the city), SQL Data Services, Mesh and WF with Dublin. Sounds promising.

Tuesday, March 31, 2009 11:19:55 PM (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
TechDays
# Thursday, February 12, 2009

I am just back from the UNIL, where I gave a presentation at the Forum Horizon 2009 about the day-to-day work of a computer science engineer. The Forum Horizon is organized by the Office cantonal d'orientation scolaire et professionnelle to present the different career possibilities to second-year gymnasium students, from air traffic controller to forensic police.
It is the sixth year in a row that I have presented this topic, and this year I decided to radically change the format of the presentation. Rather than having a bunch of slides with bullet points, I opted for big background pictures with very few words highlighting the subject. In the end, it was not easy to find the right pictures to illustrate the slides, but I think I was quite successful. The most difficult part is understanding what the audience wants to know. The goal is to stay non-technical and to explain clearly what we do as engineers (and software architects). Moreover, the people in the room do not yet know what they want to do; the idea is to help them make their choice, not necessarily to push them towards this particular job.
I posted the presentation here.

Thursday, February 12, 2009 4:04:24 PM (GMT Standard Time, UTC+00:00)  #    Comments [0] -

# Thursday, January 15, 2009

Summary:
F# is a new language coming out of Microsoft's pipeline for the Visual Studio platform. It targets the functional programming paradigm, even though imperative and object-oriented programming remain possible.
Robert Pickering starts his book by explaining the basics of F#: how to get the tools and how to use them. The book then describes the F# syntax used in the three language paradigms: functional programming, imperative programming and finally object-oriented programming. Among other things, the notion of type inference is presented. Once the syntax has been covered, the book describes how to develop web, Windows Forms or even WPF applications using the .NET framework. Data access is also addressed using the currently available technologies, such as ADO.NET or LINQ. Then a quick look at DSLs, compilation and code generation is given, presenting the lex and yacc tools that come with the language. Finally, a full chapter is dedicated to the interoperability between .NET and F#, because even though F# is built on the CLI, the language introduces several types that are not available in the other .NET languages (C# or VB.NET).

Review:
Discovering a new language is really interesting, and with F# it is the occasion to see a new paradigm: functional programming. In short, with F# everything is a value, even a function, which means that you can pass a function as a parameter to another function. The concept of type inference is also very attractive. The book is very easy to understand, and many small examples are explained in detail, making it a fast read. The first half of the book is dedicated to the language itself. The second half is more about using the .NET framework, and I would say that it is the less interesting half of the book. Indeed, during the first part you have already come across various examples using types and classes of the framework, user interface development (web or Windows) and data access, so the second part does not bring much new information. Once you know these topics from the .NET documentation or from another book, and once you have read how to access the .NET BCL from F#, this part is pretty straightforward and not really useful. Moreover, the examples used to illustrate the topics explain how to use the BCL classes more than the language itself. Nevertheless, the last parts, discussing interoperability and the possibility of building DSLs, are more interesting.
My final word is that it is a very interesting book if you want to explore another land (functional programming). Unfortunately, I also have "Expert F#" on my bookshelf; I just opened it to see what was inside and saw that it too covers the explanations and descriptions of the language from the beginning. Had I known that before, maybe I would have bought that one instead. So, if the goal is just to scratch the surface of F#, "Foundations of F#" is the best suited; if the goal is to go much deeper into the topic, then prefer "Expert F#" (a review of that one will be posted).

Thursday, January 15, 2009 9:21:33 AM (GMT Standard Time, UTC+00:00)  #    Comments [0] -
Book Review | Programming | Technical
# Friday, January 09, 2009

Yesterday, Microsoft's CEO Steve Ballmer announced the public availability of the first beta release of Windows 7. I had a chance to take a quick look at it, and my first impression was: "it's fast. And?".

Joking aside, if you don't like the Aero-style user interface, be prepared to be a bit overwhelmed. They have added one more layer of Aero, and the new glassy taskbar simplifies application navigation by replacing the multiple application icons with a single one and letting the user see a preview of the running application when hovering the mouse over the icon.

I played with it for a few minutes, including Minesweeper :-), and I quite like the user interface, the speed, and also the fact that it did not crash during the hour I tested it.

A nice little feature is the ability to show the desktop just by moving the mouse. On another note, Microsoft says connecting home devices (home computers) will be much easier than before. Those are only a few of the features that will be in Windows 7. This new version of Windows will probably be released earlier than originally expected, in order to try to make people forget about the Windows Vista flop.

Sounds promising...

Friday, January 09, 2009 3:23:25 PM (GMT Standard Time, UTC+00:00)  #    Comments [0] -

# Tuesday, January 06, 2009

Back to business after some vacation...
First of all, I would like to wish the readers a happy new year and all the best.

As is the case every year, people try to make resolutions for the new year, and I am afraid I am one of them.
Regarding this blog, I have a couple of points I would like to address this year.

The very first one is to do some cleanup of the blog, such as clearing the spam in the trackbacks and reorganizing the categories.
Then, I would like to upgrade to the latest version of the engine, v2.2, released last October.
One goal I would like to achieve this year is to be more active and to post more regularly, once a week for example. This is going to be quite challenging, I know that already, because the aim is to produce real content and not just to write for the sake of writing.
Finally, and this has been pending for a few years now (since I left LooKware in fact), the main web site really needs to be put online. On this front, some work needs to be done to find the right look and feel and also to write the content.

Quite a lot on the plate, and I hope that by the end of 2009 most (if not all) of these points will have been addressed.

Tuesday, January 06, 2009 10:15:00 AM (GMT Standard Time, UTC+00:00)  #    Comments [0] -
Blog Life
# Friday, November 14, 2008

<Disclaimer>These are personal notes of what I retained from the session. They may be incomplete, partially right or wrong. They are just part of the notes I took and of what caught my attention. Nothing prevents the reader from getting more information on their favorite web site.</Disclaimer>

Today, we can identify seven major trends in the software development process. First, search is becoming a lot more important: searching through files and emails and finding a way to organize information is now crucial, and the same applies when we write code. Then, a new user experience is coming, with new paradigms in the way development tools are presented (rich user interfaces). In terms of agility, there is a need for IntelliSense and Quick Info. More and more, development is done in a declarative way that lets the development tool do all the plumbing for us. The three remaining trends are support for legacy code, the Cloud, which influences the next steps in adapting Visual Studio, and concurrency.
During her first demo, Karen Liu showed us the new functionality of Quick Search, which now works across languages (C#, VB.NET, etc.) for types. It also offers a "search as I type" behavior and can be used to search for a file.
With Visual Studio 2010, when selecting a symbol, all references to it are highlighted, but only those with the same signature or type. The user interface, written in WPF, can easily be extended, and Karen showed us how all references to a symbol present in a file can be displayed in the margin of Visual Studio.
Adding unit tests is simpler and handled directly by the user interface, with automatic generation of the classes. IntelliSense now has a "filter as you type" feature that speeds up code writing.
Joining a running project can be difficult, even more so when a lot of code is already present. Call dependency is a new feature that allows the developer to see what the code is calling, and which parts of the code call the one selected.
Another great feature is the historical debugger. Imagine that the runtime hits a breakpoint: it is now possible to go back in the code and execute the program step by step; in other words, a kind of replay. The functions and diagnostic events view shows which events occurred and which exceptions were raised, whether caught or not. It is also now possible to record the execution of a program in order to send it to someone to reproduce the scenario.
This session was the last one of a great TechEd. Not a lot of things were announced, but the content was interesting and, for some sessions, technically advanced.
Next year, TechEd will take place in Berlin, between the 2nd and the 6th of November 2009, at the Messe Berlin (Germany). Hope to see you there.

Friday, November 14, 2008 10:48:57 PM (GMT Standard Time, UTC+00:00)  #    Comments [0] -
TechEd2008

<Disclaimer>These are personal notes of what I retained from the session. They may be incomplete, partially right or wrong. They are just part of the notes I took and of what caught my attention. Nothing prevents the reader from getting more information on their favorite web site.</Disclaimer>

Udi Dahan explains that a lot of books on patterns and practices exist on the market today, but reading them and knowing them by heart does not help if we don't design the application with flexibility in mind. He takes the example of the Visitor and Strategy patterns, which can easily be overused by architects, leading to a collapse of the application structure. The goal is not to have absolute flexibility, but to have flexibility where it makes sense and where it is needed. The same phenomenon occurred with the use of hooks.
So, Mr. Dahan tells us that we should make the roles explicit by implementing them as interfaces. Before, when we had an entity that needed to be validated, we implemented a .Validate() method on the entity. That made sense, because only the entity knew how to validate itself. But what happens if that entity links to another one that, in turn, needs to be validated? It would be fine if the call sequence were always the same; if it is not, the trouble begins. The goal is therefore to identify the roles and express them as interfaces. In the case of validation, a generic interface parameterized by the entity type should be created, along with a specific validator implementing this interface for each entity, so that a Service Locator can return the validator to be called for validation.
It is also possible to use the same pattern to differentiate the roles that can be applied to the same entity. Mr. Dahan uses it to implement message handlers.
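Here is a minimal C# sketch of that explicit-role idea; the IValidator<T> interface, the CustomerValidator and the deliberately naive ServiceLocator are illustrative names, not Udi Dahan's actual code.

```csharp
using System;
using System.Collections.Generic;

// The explicit role: "being able to validate an entity of type T".
public interface IValidator<T>
{
    void Validate(T entity);
}

public class Customer
{
    public string Name { get; set; }
}

// One concrete implementation of the role for a given entity.
public class CustomerValidator : IValidator<Customer>
{
    public void Validate(Customer customer)
    {
        if (string.IsNullOrEmpty(customer.Name))
            throw new ArgumentException("A customer must have a name.");
    }
}

// A deliberately naive service locator, just to show the lookup.
public static class ServiceLocator
{
    private static readonly Dictionary<Type, object> services =
        new Dictionary<Type, object>();

    public static void Register<T>(IValidator<T> validator)
    {
        services[typeof(IValidator<T>)] = validator;
    }

    public static IValidator<T> Resolve<T>()
    {
        return (IValidator<T>)services[typeof(IValidator<T>)];
    }
}

class Program
{
    static void Main()
    {
        ServiceLocator.Register(new CustomerValidator());

        var customer = new Customer { Name = "Contoso" };

        // The caller depends only on the role, not on the entity's internals.
        ServiceLocator.Resolve<Customer>().Validate(customer);
        Console.WriteLine("Customer is valid.");
    }
}
```

The point is that callers depend only on the role (IValidator<T>), so new roles can be attached to an entity without touching the entity itself.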

Friday, November 14, 2008 10:47:18 PM (GMT Standard Time, UTC+00:00)  #    Comments [0] -
TechEd2008

<Disclaimer>These are personal notes of what I retained from the session. They may be incomplete, partially right or wrong. They are just part of the notes I took and of what caught my attention. Nothing prevents the reader from getting more information on their favorite web site.</Disclaimer>

The Metropolis story has been running for a while now; its goal is to draw a parallel between cities, transportation, manufacturing and applications.
Pat Helland compares IT shops with cities, manufactured goods with structured data and operations, retail with business processes, and factories and buildings with applications. He starts by explaining that initially buildings simply contained people and stuff, but evolved towards a model where bringing stuff and people in and out, and connecting them, became more and more important. The same goes for applications: traditionally they were built to interface with people, contain data and perform operations. This is changing in the sense that they increasingly integrate the personal view of work. Moreover, the dissemination of knowledge is increasing, and so is the tendency to perform cooperative operations directly.
Pat distinguishes between three kinds of buildings and applications: High Road, Low Road and No Road.
On the building side, a High Road building can be recognized by its ability to evolve little by little, gaining character. It receives investment and gains new extensions or wings over time. It is the same with High Road applications, which are typically line-of-business applications requiring very high availability. We can also call them packaged applications.
Low Road buildings have a lower cost, but no style and a high turnover; most often they are inexpensive constructions. On the other hand, they are highly adaptable. For applications, we can compare this model to applications built by end users, without the need for the IT department, and if the application is no longer useful, it can be thrown away. Typically, Excel spreadsheets, Access databases or even SharePoint are considered Low Road applications by Pat Helland.
He then presents the shearing layers of buildings: Stuff, Space Plan, Building Infrastructure, Skin, Structure and Site. Each of these layers has its own lifetime, from 60 years for the Structure to 5 years for the Space Plan. Again, the same parallel can be drawn for applications.
The goal, and the conclusion of the presentation, is to leverage middleware and to build applications for reuse, in order to reduce their costs. This can be done by asking two questions: who makes money, and who saves money?
In terms of application component reuse, there is not yet a marketplace today. On the middleware side, the vendors make money and the users save money, on the condition that they work on SOA middleware. Service reuse is non-existent today. Moreover, there is a need to standardize on schemas, contracts and branding. Finally, applications are evolving to become services, and they should be designed for change.

Friday, November 14, 2008 10:45:07 PM (GMT Standard Time, UTC+00:00)  #    Comments [0] -
TechEd2008

<Disclaimer>These are personal notes of what I retained from the session. They may be incomplete, partially right or wrong. They are just part of the notes I took and of what caught my attention. Nothing prevents the reader from getting more information on their favorite web site.</Disclaimer>

Basically, S+S is about externalizing services, as we have done with electric power. Instead of every home producing its own power, we now have plants producing it and homes connecting to the grid to get what they need. It is more or less the same with the Cloud: someone hosts the resources for you, so there is no need to worry about scalability, failover and so on, letting you concentrate on the development of your application. It also allows you to publish your own services into the Cloud, making them available to others. A parallel can be drawn with transport systems:
A car corresponds to the on-premises infrastructure, but it has a maintenance cost and needs to be fixed or patched.
Renting a car is better, and is like hosting.
The train, for its part, is equivalent to the Cloud: no need to worry about maintenance at all, but the downside is that you cannot change the schedule or where it goes.
So, when going for the Cloud, you are looking for availability and scaling, but you have no control over it. It also means that the Cloud is not a silver bullet and is not suited to everything.
To manage identity, .NET Services (one of the Cloud services) relies on external authorities. The enterprise defines the identities and roles and builds a trust relationship with an external authority that will in turn be trusted by the Cloud. It also means that .NET Services needs to support several technologies.
This leads to at least two challenges: focusing on the use of SOA, and on the decentralization of resources.
To support identity management, the Cloud uses tools such as claims, tokens and Security Token Services.
Finally, to control the identities on .NET Services, an MMC snap-in is available to manage them.
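To visualize how claims, tokens and a Security Token Service fit together, here is a toy C# model of the flow; none of these classes correspond to a real Microsoft API (real tokens are signed and validated cryptographically), and all names are purely illustrative.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// A claim is just a typed statement about the user.
class Claim
{
    public string Type { get; set; }
    public string Value { get; set; }
}

// A token is a set of claims plus the name of the issuing authority.
class SecurityToken
{
    public string Issuer { get; set; }
    public List<Claim> Claims { get; set; }
}

// Stand-in for a Security Token Service run by the enterprise.
class SecurityTokenService
{
    private readonly string name;
    public SecurityTokenService(string name) { this.name = name; }

    public SecurityToken Issue(string user, string role)
    {
        return new SecurityToken
        {
            Issuer = name,
            Claims = new List<Claim>
            {
                new Claim { Type = "name", Value = user },
                new Claim { Type = "role", Value = role }
            }
        };
    }
}

// Stand-in for a Cloud service that trusts a list of external authorities.
class CloudService
{
    private readonly HashSet<string> trustedIssuers;

    public CloudService(params string[] issuers)
    {
        trustedIssuers = new HashSet<string>(issuers);
    }

    // The service never checks the user directly, only the token's issuer
    // and the claims carried by the token.
    public bool IsAdmin(SecurityToken token)
    {
        return trustedIssuers.Contains(token.Issuer)
            && token.Claims.Any(c => c.Type == "role" && c.Value == "admin");
    }
}

class Program
{
    static void Main()
    {
        var sts = new SecurityTokenService("ContosoSTS");
        var cloud = new CloudService("ContosoSTS");

        var token = sts.Issue("alice", "admin");
        Console.WriteLine(cloud.IsAdmin(token)); // True
    }
}
```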

Friday, November 14, 2008 10:43:05 PM (GMT Standard Time, UTC+00:00)  #    Comments [0] -
TechEd2008