25 Oct 2014

Planet Identity

Anil John: An Overview of Canada's C2G Identity Services

A high level technical overview of the components of Canada's Cyber Authentication Renewal Initiative

25 Oct 2014 1:30pm GMT

Mike Jones - Microsoft: JOSE -36 and JWT -30 drafts addressing additional IESG review comments

These JOSE and JWT drafts incorporate resolutions to some previously unresolved IESG comments. The primary change was adding flattened JSON Serialization syntax for the single digital signature/MAC and single recipient cases. See http://tools.ietf.org/html/draft-ietf-jose-json-web-signature-36#appendix-A.7 and http://tools.ietf.org/html/draft-ietf-jose-json-web-encryption-36#appendix-A.5 for examples. See the history entries for details on the few other changes. No breaking changes were made. The specifications […]
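For readers who have not yet looked at the appendices, the following is a minimal, illustrative Java sketch (plain JDK 8, no JOSE library; the header, payload, and key values are made up) of what a flattened JWS JSON Serialization object looks like for the single-signature HMAC case. Consult the draft appendices linked above for the normative examples.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class FlattenedJwsSketch {
    public static void main(String[] args) throws Exception {
        Base64.Encoder b64url = Base64.getUrlEncoder().withoutPadding();

        // Protected header and payload, base64url-encoded as required by JWS.
        String protectedHeader = b64url.encodeToString(
                "{\"alg\":\"HS256\"}".getBytes(StandardCharsets.UTF_8));
        String payload = b64url.encodeToString(
                "{\"iss\":\"example\"}".getBytes(StandardCharsets.UTF_8));

        // HS256 signature over the JWS Signing Input:
        // ASCII(BASE64URL(protected header) || '.' || BASE64URL(payload))
        // (demo key only; a real HS256 secret should be at least 256 bits)
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(
                "demo-key-use-a-real-256-bit-secret".getBytes(StandardCharsets.UTF_8),
                "HmacSHA256"));
        String signature = b64url.encodeToString(
                mac.doFinal((protectedHeader + "." + payload)
                        .getBytes(StandardCharsets.US_ASCII)));

        // Flattened serialization: top-level members instead of the
        // one-element "signatures" array used by the general JSON serialization.
        String flattened = "{"
                + "\"payload\":\"" + payload + "\","
                + "\"protected\":\"" + protectedHeader + "\","
                + "\"signature\":\"" + signature + "\"}";
        System.out.println(flattened);
    }
}
```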

25 Oct 2014 6:28am GMT

24 Oct 2014


Radovan Semančík - nLight: How to Start Up an Open Source Company

Evolveum is a successful open source company now. We develop open source Identity and Access Management (IAM) software. We legally established Evolveum in 2011, but its origins date back to the mid-2000s. In 2014 we are moving out of the startup stage into a sustainable stage. But it was a long way to get there. I would like to share our experiences and insights in the hope that this will help others in their attempts to establish an open source business.

The basic rules of the early game are these: It all starts with an idea. Of course. You need to figure out something that is not yet there and that people need. That is the easy part. Then you have to prototype it. Even the brightest idea is almost worthless until it is implemented. You have to spend your own time and money to implement the prototype. You cannot expect anyone to commit time or money to your project until you have a working prototype. So prepare for that - psychologically and financially. You will have to commit your savings, secure a loan or sacrifice your evenings and nights for approx. 6 months. If you spend less than that, the prototype is unlikely to be good enough to impress others. And you cannot have a really successful project without the support of others (I will get to that). Make sure that the prototype has a proper open source license from day one, that it is published very early and that it follows open source best practices. Any attempt to twist and bend these rules is likely to backfire when you are at your most vulnerable.

Then comes the interesting part. Now you have two options. The choice that you make now will determine the future of your company for good. Therefore be very careful here. The options are:

Fast growth: Find an investor. This is actually very easy to do if you have a good idea, a good prototype and you are a good leader. Investors are hungry for such start-up companies. You have to impress the investor. This is the reason why the prototype has to be good. If you started alone, find an angel investor. If you already have a small team, find a venture capitalist. They will give you money to grow the company. Quite a lot of money, actually. But there is a catch. Or, better said, a whole bunch of them. Firstly, the investor will take a majority of the shares in your company in exchange for the money. You will have to give away control over the company. Secondly, you will need to bind yourself to the company for several years (this may sometimes be as long as 10 years in total). Which means you cannot leave without losing almost everything. And thirdly and most importantly: you must be able to declare that your company can grow at least ten times as big in less than three years. Which means that your idea must be a super-bright, ingenious thingy that really everyone desperately needs, and it also needs to be cheap to produce, easy to sell and well-timed - which is obviously quite unlikely. Or you must be inflating the bubble. Be prepared that a good part of the investment will be burned to fuel marketing, not technology. You will most likely start paying attention to business issues and there will be no time left to play with the technology any more. Also be prepared that your company is likely to be sold to some mega-corporation if it happens to be successful - with you still inside the company and business handcuffs still on your hands. You will get your money in the end, but you will have almost no control over the company or the product.

Self-funded growth: Find more people like you. Show them the prototype and persuade them to work together. Let these people become your partners. They will get company shares in exchange for the work and/or money that they invest in the company. The financiers have a very fitting description for this kind of investment: FFF, which means Friends, Family and Fools. This is another reason why the prototype has to be good. You have to persuade people like you to sacrifice an arm and a leg to your project. They have to really believe in it. Use this FFF money to make a product out of your early prototype. This will take at least 1-2 years and there will be almost no income. Therefore prepare the money for this. Once you are past that stage the crucial part comes: use your product to generate income. No, not by selling support or subscriptions or whatever. That is not going to work at this stage. Nobody will pay enough money for the product until it is well known and proven in practice. You have to use the product yourself. You have to eat your own dogfood. You have to capitalize on the benefits that the product brings, not on the product itself. Does your product provide some service? Set up a SaaS and provide the service for money. Sell your professional services and mix in your product as an additional benefit. Sell a solution that contains your product. Does your product improve something (performance, efficiency)? Team up with a company that does this "something" and agree on sharing the revenue or savings generated by your product. And so on. You have to bring your own skin to the game. Use the early income to sustain product development. Do not expect any profit yet. Also spend some money and time on marketing. But most of the money still needs to go to the technology. If the product works well then it will eventually attract attention. And then, only then, will you get enough money from subscriptions to fund the development and make a profit. Be prepared that it can take 3-6 years to get to this stage. And a couple more years to repay your initial investment. This is a slow and patient business. In the end you will retain your control (or significant influence) over the product and company. But it is unlikely to ever make you a billionaire. Yet it can make a decent living for you and your partners.

Theoretically there is also a middle way. But that depends on a reasonable investor. An investor that cares much more about the technology than about money, valuations and market trends. And this is an extremely rare breed. You can also try crowdfunding. But this seems to work well only for popular consumer products, which are not very common in the open source world. Therefore it looks like your practical options are either bubble or struggle.

And here are a couple of extra tips: Do not start with an all-engineer team. You need at least one business person on the team. Someone who can sell your product or services. Someone who can actually operate the company. You also need one visionary on the team. Whatever approach you choose, it is likely that your company will reach its full potential in 8-12 years. Not any earlier. If you design your project just for the needs of today, you are very likely to end up with an obsolete product before you can even capitalize on it. You also need a person who has their feet firmly on the ground. The product needs to start working almost from day one, otherwise you will not be able to gain momentum. Balancing the vision and the reality is the tough task. Also be prepared to rework parts of your system all the time. No design is ever perfect. Ignoring refactoring needs and just sprinting for features will inevitably lead to a development dead-end. You cannot afford that. It ruins all your investment. The software is never done. Software development never really ends. If it does end, the product itself is essentially dead. Plan for a continuous and sustainable development pace during the entire lifetime of your company. Do not copy any existing product. Especially not another open source product. It is pointless. The existing product will always have a huge head start and you cannot realistically ever make that up unless the original team makes some huge mistake. If you need to do something similar to what another project already does, team up with them. Or make a fork and start from that. If you really start on a green field, you have to use a truly unique approach to justify your very existence.

I really wish that someone had explained this to me five years ago. We have chosen to follow the self-funded way, of course. But we had to explore many business dead-ends to get there. It was not easy. But here we are, alive and well. Good times are ahead. And I hope that this description helps other teams that are just starting their companies. I wish them a much smoother start than we had.

(Reposted from https://www.evolveum.com/start-open-source-company/)

24 Oct 2014 1:23pm GMT

Kuppinger Cole: Executive View: SAP Audit Management - 71162

In KuppingerCole

Audits are a must for any organization. The rapidly growing number of ever-tighter regulations in recent years, together with the growing relevance and enforcement of Corporate Governance and, as part of it, Risk Management, has led to an increase in both the number and complexity of audits. These audits affect all areas of an organization, in particular the business departments and IT.



24 Oct 2014 1:15pm GMT

WAYF News: Aalborg Business College now part of WAYF

Aalborg Business College joined WAYF today. Its students and employees thus now have the ability to log into WAYF-supporting web services using their familiar institutional accounts.

24 Oct 2014 11:45am GMT

Kuppinger Cole: Amazon opens data center in Germany

In Martin Kuppinger

Today, AWS (Amazon Web Services) announced the opening of their new region, located in Frankfurt, Germany. The new facilities actually contain two availability zones, i.e. at least two distinct data centers. AWS can now provide a local solution to customers in mainland Europe, located close to one of the most important Internet hubs. While on one hand this is important from a technical perspective (for instance, with respect to potential latency issues), it is also an important move from a compliance perspective. The uncertainty many customers feel regarding data protection laws in countries such as Germany, as well as the strictness of these regulations, is a major inhibitor preventing the rapid adoption of cloud services.

Having a region in Germany is interesting not only for German customers, but also for cloud customers from other EU countries. AWS claims that, since they provide a pure-play IaaS (Infrastructure as a Service) cloud service that just provides the infrastructure on which the VMs (virtual machines) and customers' applications reside, their customers have full control over their own data, especially since AWS CloudHSM allows customers to hold their encryption keys securely in the cloud. This service relies on FIPS 140-2 certified hardware and is completely managed by the customer via a secure protocol. Notably, the customer can decide where their data resides. AWS does not move customer data outside of the region where the customer places it. With the new region, a customer can design a high-availability infrastructure within the EU, i.e. Germany and Ireland.

KuppingerCole strongly recommends that customers encrypt their data in the cloud in a way that allows them to retain control over their keys. However, it must be remembered that the responsibility of the data controller stretches from end to end. It is not limited to protecting the data held on a cloud server; it must also cover on-premise systems, networks, and end-user devices. The cloud service provider (AWS) is responsible for some, but not all, of this. The data controller needs to be clear about this division of responsibilities and take action to secure the whole process, which may involve several parties.
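To make the "retain control over your keys" recommendation concrete, here is a minimal, generic Java sketch (not AWS-specific; the sample plaintext and in-memory key handling are purely illustrative) in which data is encrypted with a locally held AES key before it ever leaves the organization, so that only ciphertext would be stored with the cloud provider.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

public class ClientSideEncryptionSketch {
    public static void main(String[] args) throws Exception {
        // The AES key is generated and kept on-premise (or in an HSM under the
        // customer's control); the cloud provider never sees it.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        // Fresh random nonce for AES-GCM.
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(
                "customer data to be stored in the cloud".getBytes(StandardCharsets.UTF_8));

        // Only the IV and ciphertext are uploaded; decryption requires the local key.
        System.out.println("ciphertext bytes: " + ciphertext.length);
    }
}
```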

Clearly, all this depends on which services the customers are using and the specific use they make of them. Amazon provides comprehensive information on data compliance issues; additional information on compliance with specific German laws is also provided (in German). The AWS approach should allow customers to meet requirements regarding the geographical location of data and, based on the possession of keys, keep it beyond the control of foreign law enforcement. However, there is still a grey area: Amazon operates the hardware infrastructure and hypervisors. No information was available regarding where this infrastructure is managed from - whether it is fully done from the German data center, 24×7, or whether there is a follow-the-sun or other remote management approach.

Cloud services offer many potential benefits for organizations. These include flexibility to quickly grow and shrink capacity on demand and to avoid costly hoarding of internal IT capacity. In many cases, technical security measures provided by a cloud service provider exceed those provided on-premise, and the factors inhibiting a move to cloud services are more psychological than technical. However, any business needs to be careful to avoid becoming wholly dependent upon one single supplier.

In sum, this move by Amazon reduces the factors inhibiting German and other European customers from moving to the cloud, at least at the IaaS level. For software companies from outside of the EU offering their solutions based on the AWS infrastructure as cloud services, there is less of a change. Moving up the stack towards SaaS, the questions of who is doing data processing, who is in control of data, or whether foreign law enforcement might bypass European data regulations, are becoming more complex.

Hence, we strongly recommend that customers use a standardized, risk-based approach for selecting cloud service providers and ensure that this approach is approved by their legal departments and auditors. While the recent Amazon announcement reduces inhibitors, the legal aspects of moving to the cloud (or not) still require thorough analysis involving experts from both the IT and the legal/audit side.

More information on what to consider when moving to the cloud is provided in Advisory Note: Selecting your cloud provider - 70742 and Advisory Note: Security Organization, Governance, and the Cloud - 71151.

24 Oct 2014 7:37am GMT

23 Oct 2014


Kuppinger Cole: Big News from the FIDO Alliance

In Alexei Balaganski

The FIDO Alliance (where FIDO stands for Fast IDentity Online) is an industry consortium formed in July 2012 with the goal of addressing the lack of interoperability among strong authentication devices. Its current members include strong authentication solution vendors (such as RSA, Nok Nok Labs or Yubico), payment providers (VISA, MasterCard, PayPal, Alibaba), as well as IT industry giants like Microsoft and Google. The mission of the FIDO Alliance has been to reduce reliance on passwords for authentication and to develop specifications for open, scalable and interoperable strong authentication mechanisms.

KuppingerCole has been closely following the FIDO Alliance's developments for the last couple of years. Initially, Martin Kuppinger was somewhat skeptical about the alliance's chances of gaining enough support and acceptance among vendors. However, the number of new members joining the alliance, as well as announcements like the first FIDO authentication deployment by PayPal and Samsung earlier this year, confirm its dedication to leading a paradigm shift in the current authentication landscape. It's not just about getting rid of passwords, but about giving users the opportunity to rely on their own personal digital identities, potentially bringing to an end the current rule of social logins.

After years of collaboration, the Universal Authentication Framework (UAF) and Universal 2nd Factor (U2F) specifications were made public in October 2014. This was closely followed by several announcements from different Alliance members unveiling products and solutions implementing the new FIDO U2F standard.

The one that definitely made the biggest splash is, of course, Google's announcement of strengthening their existing 2-step verification with a hardware-based second factor, the Security Key. Although Google has been a strong proponent of multifactor authentication for years, their existing infrastructure is based on one-time codes sent to users' mobile devices. Such schemes are known to be prone to various attacks and cannot protect users from falling victim to a phishing attack.

The Security Key (a physical USB device manufactured by Yubico) enables much stronger verification based on cryptographic algorithms. Because a separate key is registered for each service, users can reliably tell a real Google website from a fake one. Of course, this first deployment based on a USB device has its deficiencies as well; for example, it won't work on current mobile devices, since they lack a suitable USB port. However, since the solution is based on a standard, it is expected to work with any compatible authentication device or software solution from other alliance members.

Currently, U2F support is available only in the Google Chrome browser, but since the standard is backed by such a large number of vendors, including major players like Microsoft and Salesforce, I am sure that other browsers will follow soon. Another big advantage of an established standard is the availability of libraries that enable quick inclusion of U2F support in existing client applications and websites. Yubico, for example, provides a set of libraries for different languages, and Google offers open source reference code for the U2F specification as well.
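To give a feel for the mechanism, here is a rough Java sketch of the core signature check a relying party performs on a U2F authentication response, based on the published U2F raw message format. The method and parameter names are mine, and production code should use one of the vendor libraries or the reference code mentioned above; a real relying party must also validate the challenge and origin inside the client data and track the counter value.

```java
import java.math.BigInteger;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.security.AlgorithmParameters;
import java.security.KeyFactory;
import java.security.MessageDigest;
import java.security.PublicKey;
import java.security.Signature;
import java.security.spec.ECGenParameterSpec;
import java.security.spec.ECParameterSpec;
import java.security.spec.ECPoint;
import java.security.spec.ECPublicKeySpec;
import java.util.Arrays;

public class U2fAuthCheckSketch {

    /**
     * Verifies a U2F authentication response. The signed data is
     * SHA-256(appId) || user-presence byte || 4-byte counter || SHA-256(clientData),
     * signed with the per-service ECDSA P-256 key created at registration.
     */
    static boolean verify(byte[] registeredPublicKey,  // 65-byte uncompressed point: 0x04 || X || Y
                          String appId,
                          byte[] clientData,
                          byte userPresence,
                          int counter,                  // 32-bit big-endian counter from the device
                          byte[] derSignature) throws Exception {
        MessageDigest sha256 = MessageDigest.getInstance("SHA-256");
        byte[] appParam = sha256.digest(appId.getBytes(StandardCharsets.UTF_8));
        byte[] challengeParam = sha256.digest(clientData);

        byte[] signedData = ByteBuffer.allocate(32 + 1 + 4 + 32)
                .put(appParam).put(userPresence).putInt(counter).put(challengeParam)
                .array();

        // Rebuild the P-256 public key from the uncompressed point.
        AlgorithmParameters params = AlgorithmParameters.getInstance("EC");
        params.init(new ECGenParameterSpec("secp256r1"));
        ECParameterSpec curve = params.getParameterSpec(ECParameterSpec.class);
        BigInteger x = new BigInteger(1, Arrays.copyOfRange(registeredPublicKey, 1, 33));
        BigInteger y = new BigInteger(1, Arrays.copyOfRange(registeredPublicKey, 33, 65));
        PublicKey pub = KeyFactory.getInstance("EC")
                .generatePublic(new ECPublicKeySpec(new ECPoint(x, y), curve));

        Signature verifier = Signature.getInstance("SHA256withECDSA");
        verifier.initVerify(pub);
        verifier.update(signedData);
        return verifier.verify(derSignature);
    }
}
```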

In a sense, this first large-scale U2F deployment by Google is just the first step on a long journey towards the ultimate goal of getting rid of passwords completely. But it looks like a large group sharing the same vision has a much better chance of reaching that goal sooner than anybody planning to walk the whole way alone.

23 Oct 2014 1:25pm GMT

22 Oct 2014


Brad Tumy - Oracle: Resetting Forgotten Passwords with @ForgeRock #OpenAM

Implementing the "Resetting Forgotten Passwords" functionality as described in the OpenAM Developer's Guide requires some additional custom code. It's pretty straightforward to implement and can be done in 4 steps (per the Developer's Guide): configure the Email Service, perform an HTTP POST with the user's id, OpenAM looks up the email address (based on […]
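As a rough illustration of the HTTP POST step, here is a minimal Java sketch using plain HttpURLConnection. The deployment URL and username are hypothetical, and the exact endpoint, realm and payload should be taken from the OpenAM Developer's Guide for your version.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ForgotPasswordRequestSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical deployment URL; verify the endpoint and realm for your OpenAM version.
        URL url = new URL("https://openam.example.com:8443/openam/json/users?_action=forgotPassword");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        // Submit the user's id; OpenAM then looks up the email address and
        // sends the confirmation message via the configured Email Service.
        byte[] body = "{\"username\":\"bjensen\"}".getBytes(StandardCharsets.UTF_8);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }
        System.out.println("HTTP " + conn.getResponseCode());
    }
}
```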

22 Oct 2014 11:45pm GMT

Julian Bond: I want to buy a final edition 7th gen iPod Classic 160 with the v2.0.5 firmware in the UK. I don't mind...

I want to buy a final edition 7th gen iPod Classic 160 with the v2.0.5 firmware in the UK. I don't mind a few scratches as long as the display is still ok. Anyone?

I need a 7th generation Classic 160 because this went back to a single platter drive and there's a 240Gb disk that fits and works. The previous 6th Gen 160 (which I have) used a dual platter drive with an unusual interface and can't be upgraded.

About 2 years ago, the final 7th Gen Classic iPod was updated with slightly different hardware that works with the final v2.0.5 firmware. If it came with 2.0.4 then it probably can't be upgraded. 2.0.5 is desirable because there's a software setting to disable the EU volume limit. When Apple did all this, they didn't actually update the product codes or SKU#. So people will claim they have a 7th Gen MC297QB/A or MC297LL/A and it might or might not be the right one. The only way to be sure is to try to update to 2.0.5.

So at the moment I'm chasing several on eBay, but I have to wait for the sellers to confirm what they're actually selling before putting in bids, and I keep losing out. Apparently I'm not alone, as prices are rising. The few remaining brand-new ones are quoted at "Buy Now" prices at a premium, sometimes twice the final RRP. Gasp!

I f***ing hate Apple for playing all these games. I hate them for discontinuing the Classic 160. I hate that there's no real alternative.

1st world problems, eh? It seems like just recently I keep running up against this. I'm constantly off balance because things I thought were sorted and worked OK are no longer available. Or the company's gone bust or been taken over. Or the product has been updated and what was good is now rubbish. Or the product is OK, but nobody actually stocks the whole range so you have to buy it on trust over the net.
[from: Google+ Posts]

22 Oct 2014 2:31pm GMT

Julian Bond: Aphex Twin leaking a fake version of Syro a few weeks before the official release was genius. It's spread...

Aphex Twin leaking a fake version of Syro a few weeks before the official release was genius. It's spread all over the file sharing sites, so it's hard to find the real release. The file names match[1]. The music is believable but deliberately lacks lustre. It's really a brilliant pastiche of an Aphex Twin album, as if some Russian producer had gone out of their way to make an homage to what they thought Aphex Twin was doing.

http://www.electronicbeats.net/en/features/reviews/the-fake-aphex-twin-leak-is-a-hyperreal-conundrum/

[1] What's a bit weird though is that the MP3 files not only have the same filenames but seem to be the same size and have the same checksum.
EB Reviews: Fake 'Syro' – Electronic Beats
Am I one of those people who can't tell the difference between authentic Aphex tunes and sneering knockoffs?

[from: Google+ Posts]

22 Oct 2014 8:55am GMT

21 Oct 2014


Courion: Data Breach? Just Tell It Like It Is

Access Risk Management Blog | Courion

Kurt Johnson, Vice President of Strategy and Corporate Development, has posted a blog on Wired Innovation Insights titled "Data Breach? Just Tell It Like It Is."

In the post, Kurt discusses the negative PR implications of delayed breach disclosure and recommends improving your breach deterrence and detection capabilities by continuously monitoring identity and access activity for anomalous patterns and problems, such as orphan accounts, duties that need to be segregated, ill-conceived provisioning or just unusual activity.

Read the full post now.

blog.courion.com

21 Oct 2014 10:25pm GMT

Neil Wilson - UnboundID: UnboundID LDAP SDK for Java 2.3.7

We have just released the 2.3.7 version of the UnboundID LDAP SDK for Java. You can get the latest release online at the UnboundID Website, the SourceForge project page, or in the Maven Central Repository.

Complete release note information is available online on the UnboundID website, but some of the most significant changes include:

  • Updated the logic used to select the TLS protocols to use for secure communication. SSLv3 is now disabled by default in response to the recent POODLE bug. On IBM JVMs, the set of enabled TLS protocols should be more broadly compatible with earlier TLS versions when support for TLSv1.1 or TLSv1.2 is enabled.

  • Added the ability to perform improved SSLSocket validation, which makes it possible to perform certificate hostname validation in a more secure and convenient manner than was previously available. It is also now possible to get access to the SSL session associated with a connection secured via SSL/TLS or StartTLS.

  • Added a new server set that can work in a DNS round-robin configuration, in which multiple IP addresses are associated with the same resolvable name.

  • Added an LDAPConnectionPool.shrinkPool method that can be used to reduce the number of currently-available connections to a specified number (see the sketch after this list).

  • Improved support for class inheritance in the LDAP SDK persistence framework. If one class marked with @LDAPObject is a subclass of another class marked with @LDAPObject, then the logic used to construct the entry's DN for instances of the subclass may be inherited from the superclass. Also, DN fields and entry fields will be properly handled in subclasses, and improvements have been made in requesting specific attributes to include in search result entries.

  • Added a new interceptor API to the in-memory directory server. This API can be used to alter or reject an LDAP request before it is processed by the server, to alter or suppress search result entries or references, and to alter LDAP results before they are returned to the client.

  • Updated the searchrate, modrate, authrate, and search-and-mod-rate tools to support altering the rate at which they process operations over time. Also updated these tools to make it possible to programmatically interrupt their processing.

  • Improved support for automatic referral following and auto-reconnect to ensure that the newly-established connection will properly use StartTLS if the original connection had successfully used StartTLS to secure its communication.

  • Fixed a bug in the Entry.applyModifications method that could cause valid modifications to be rejected if those modifications targeted attributes used in the entry's DN but would not actually have resulted in a change to the entry DN.

  • Fixed a bug in the Entry.diff method in which, if provided with a specific set of attributes to examine, the method would not examine variants of those attributes containing attribute options.

  • Fixed a potential null pointer exception that could arise as a result of a race condition in the course of closing a connection. Fixed a potential illegal argument exception that could arise when closing a connection pool if multiple concurrent threads were used to close the connections but no connections were currently available in the pool.
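As a small illustration of one of the additions above, here is a sketch of the new LDAPConnectionPool.shrinkPool method; the host name and pool sizes are made up.

```java
import com.unboundid.ldap.sdk.LDAPConnection;
import com.unboundid.ldap.sdk.LDAPConnectionPool;

public class ShrinkPoolSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical directory host; the pool starts with 1 connection and
        // may grow to 10 under load.
        LDAPConnection connection = new LDAPConnection("ldap.example.com", 389);
        LDAPConnectionPool pool = new LDAPConnectionPool(connection, 1, 10);

        // ... perform a burst of operations that grows the pool ...

        // New in 2.3.7: discard currently-available connections beyond the
        // given count, e.g. to release resources after a traffic spike.
        pool.shrinkPool(2);

        pool.close();
    }
}
```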

21 Oct 2014 5:32pm GMT

OpenID.net: Notice of Vote for Errata to OpenID Connect Specifications

The official voting period will be between Friday, October 31 and Friday, November 7, 2014, following the 45 day review of the specifications. For the convenience of members, voting will actually open a week before Friday, October 31 on Friday, October 24 for members who have completed their reviews by then, with the voting period [...]

21 Oct 2014 5:43am GMT

OpenID.net: Notice of Vote for Implementer’s Draft of OpenID 2.0 to OpenID Connect Migration Specification

The official voting period will be between Friday, October 31 and Friday, November 7, 2014, following the 45 day review of the specification. For the convenience of members, voting will actually open a week before Friday, October 31 on Friday, October 24 for members who have completed their reviews by then, with the voting period [...]

21 Oct 2014 5:38am GMT

20 Oct 2014


Radovan Semančík - nLight: Project Provisioning with midPoint

Evolveum midPoint is a unique Identity Management (IDM) system. MidPoint is a robust open source provisioning solution. Being open source, midPoint is developed in a fairly rapid, incremental and iterative fashion. And the recent version introduced a capability that allows midPoint to reach beyond the traditional realm of identity management.

Of course, midPoint is great at managing and synchronizing all types of identities: employees, contractors, temporary workers, customers, prospects, students, volunteers - you name it, midPoint does it. MidPoint can also manage and synchronize functional organizational structures: divisions, departments, sections, etc. Even though midPoint does this better than most other IDM systems, these features are not exactly unique by themselves. What is unique about midPoint is that it refines these mechanisms into generic and reusable concepts. MidPoint mechanisms are carefully designed to work together. This makes midPoint much more than just the sum of its parts.

One interesting consequence of this development approach is the unique ability to provision projects. We all know the new lean, project-oriented enterprises. The importance of the traditional tree-like functional organizational structure is diminishing and a flat, project-based organizational structure takes the lead. Projects govern almost every aspect of company life, from the development of a groundbreaking new product to the refurbishing of an office space. Projects are created, modified and closed almost on a daily basis. This is usually done manually by system administrators: create a shared folder on a file server, set up proper access control lists, create a distribution list, add members, create a new project group in Active Directory, add members, create an entry in the task tracking system, bind it with the just-created group in Active Directory ... It all takes a couple of days or weeks to be done. This is not very lean, is it?

MidPoint can easily automate this process. Projects are yet another type of organizational unit that midPoint manages. MidPoint can maintain an arbitrary number of parallel organizational structures. Therefore adding an orthogonal project-based structure to an existing midPoint deployment is a piece of cake. MidPoint also supports membership of a single user in an arbitrary number of organizational units, which effectively creates a matrix organizational structure. As midPoint projects are just organizational units, they can easily be synchronized with other systems. MidPoint can be configured to automatically create the proper groups, distribution lists and entries in the target systems. And as midPoint knows who the members of the project are, it can also automatically add the correct accounts to the groups it has just created.

Project provisioning

However, this example is just too easy. MidPoint can do much more. And anyway, the modern leading-edge lean progressive organizations are not only project-based but also customer-oriented. The usual requirement is not only to support internal projects but especially customer-facing projects. Therefore I have prepared a midPoint configuration that illustrates automated provisioning of such customer projects.

The following screenshot illustrates the organizational structure maintained in midPoint. It shows a project named Advanced World Domination Program (or AWDP for short). The project is implemented for the customer ACME, Inc. The project members are jack and will. You can also see another customer and a couple of other projects there. The tabs also show the different (parallel) organizational structures maintained by midPoint. But for now we only care about the Customers structure.

Project provisioning: midPoint

MidPoint is configured to replicate the customer organizational unit to LDAP. Therefore it will create the entry ou=ACME,ou=customers,dc=example,dc=com. It is also configured to synchronize the project organizational unit to LDAP as an LDAP group. Therefore it will create the LDAP entry cn=AWDP,ou=ACME,ou=customers,dc=example,dc=com with the proper groupOfNames object class. MidPoint is also configured to translate project membership in its own database into group membership in LDAP. Therefore the cn=AWDP,... group contains the members uid=jack,... and uid=will,....

Project provisioning: LDAP
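To make the resulting directory content a bit more concrete, here is a small illustrative sketch using the UnboundID LDAP SDK (featured elsewhere in this digest) that adds entries equivalent to what midPoint provisions automatically. The host, credentials and the ou=People location of the user entries are hypothetical; in a real deployment midPoint itself creates and maintains these entries.

```java
import com.unboundid.ldap.sdk.LDAPConnection;

public class CustomerProjectEntriesSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical directory host and credentials.
        LDAPConnection conn = new LDAPConnection("ldap.example.com", 389,
                "cn=Directory Manager", "password");

        // Customer organizational unit, as replicated by midPoint.
        conn.add("dn: ou=ACME,ou=customers,dc=example,dc=com",
                 "objectClass: organizationalUnit",
                 "ou: ACME");

        // Project group; the member DNs assume user entries live under
        // ou=People,dc=example,dc=com (hypothetical).
        conn.add("dn: cn=AWDP,ou=ACME,ou=customers,dc=example,dc=com",
                 "objectClass: groupOfNames",
                 "cn: AWDP",
                 "member: uid=jack,ou=People,dc=example,dc=com",
                 "member: uid=will,ou=People,dc=example,dc=com");

        conn.close();
    }
}
```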

A similar configuration is used to synchronize the projects to GitLab. The customers are transformed to GitLab groups and GitLab projects are created in accordance with midPoint projects.

Project provisioning: GitLab

... and project members are correctly set up:

Project provisioning: GitLab

All of this is achieved by using a relatively simple configuration. The configuration consists of just four XML files: one file to define access to each resource, one file to define the customer meta-role and one for the customer project meta-role. No custom code is used except for a couple of simple scriptlets that amount to just a few lines of Groovy. Therefore this configuration is easily maintainable and upgradeable.

And midPoint can do even more still. As midPoint has a built-in workflow engine, it can easily handle project membership requests and approvals. MidPoint has a rich delegated administration capability, therefore management of project information can be delegated to project managers. MidPoint synchronization is bi-directional and flexible: one system can be authoritative for some part of the project data (e.g. scope and description) and another system for a different type of data (e.g. membership). And so on. The possibilities are countless.

This feature is a must-have for any lean, project-oriented organization. Managing projects manually is the 20th-century way of doing things and a waste of precious resources. Finally midPoint is here to bring project provisioning into the 21st century. And it does it cleanly, elegantly and efficiently.

(Reposted from https://www.evolveum.com/project-provisioning-midpoint/)

20 Oct 2014 1:09pm GMT

Vittorio Bertocci - Microsoft: TechEd Europe, and a Quick Jaunt in UK & Netherlands

I can't believe TechEd Europe is already a mere week away!

This year I have just 1 session, but TONS of new stuff to cover… including news we didn't break yet! 75 mins will be very tight, but I'll do my best to fit everything in. If you want to put it [...]

20 Oct 2014 8:07am GMT