30 Mar 2015

feedPlanet Identity

Gluu: Gluu Server Training in San Francisco, CA

After RSA Security Conference on Wednesday, April 22, join Gluu CEO Mike Schwartz at WeWork SOMA for a hands on training session exploring how to use the Gluu Server to secure web and mobile applications. This workshop will cover how to deploy the Gluu Server on a fresh VM, how to configure single sign-on (SSO) … Read more >>

30 Mar 2015 9:18pm GMT

Axel Nennker: New Firefox Add-On: QRCode Login

Current login mechanisms suffer from poor support by browsers and sites.
Browsers offer in-browser password storage, but that's about it.
Standardized authentication methods like HTTP Digest Authentication and HTTP Basic Authentication were never really accepted by commercially successful sites. They work, but the user experience is bad, especially if the user does not have an account yet.
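For reference, HTTP Basic Authentication (RFC 7617) just sends the credentials base64-encoded in a header on every request, which is part of why the experience never got better than the browser's built-in dialog. A minimal sketch:

```python
import base64

# HTTP Basic Authentication (RFC 7617): the client concatenates
# "user:password", base64-encodes it, and sends the result in the
# Authorization header of every request.
def basic_auth_header(user: str, password: str) -> str:
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

print(basic_auth_header("user", "password"))  # Basic dXNlcjpwYXNzd29yZA==
```

Note that base64 is an encoding, not encryption: without TLS the credentials are effectively sent in the clear.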

So most sites are left with form-based authentication, where the site has full control over the UI and UX. Sadly, the browser has little to offer here to help the site or the user, other than trying to identify signup and login forms through crude guesses based on the existence of password fields.

There is no standardized way for sites and browsers to work together.
Here is a list of attempts to solve some of the above issues:

Federations have their drawbacks too. Even Facebook login went dark for four hours a while ago, which left sites depending on Facebook without user login.

In general there is this chicken-and-egg problem:
Why should sites support new-mechanism-foo when there is no browser support?
Why should browsers support new-mechanism-foo when there are no sites using it?

Then there are password stores. I use passwordsafe to store my passwords in one place. If I do not have access to that place (my PC), then I can't log in. Bummer.
Others use stores hosted on the Internet, and those usually support most browsers and OSes through plugins/addons and non-standard trickery.
I never could convince myself to trust the providers.

So. Drum-roll.
I have started working on a mechanism that keeps a password store on your mobile and lets you log in on your PC using the PC's camera.

The user story is as follows:

  1. Browse to a site's login page, e.g. https://github.com/login
  2. Have the Firefox addon installed:
    https://github.com/AxelNennker/qrcodelogin
  3. Click on the addon's icon
  4. Present your credential-qrcode to the PC's camera
  5. Be logged in

Here is an example qrcode containing the credentials as a JSON array
["axel@nennker.de","password"]:


The qrcode could be printed on paper or generated by your password store on your mobile. To help the user with the selection of the matching credentials the addon presents a request-qrcode to be read by the mobile first. This way the mobile ID-client can select the matching credentials.
(If you'd rather not install addons to test this, and for a super quick demo of the qrcode reading using your webcam, please go to http://axel.nennker.de/gum.html and scan a code)
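To illustrate the matching step, here is a minimal sketch of what a mobile ID-client could do with the request-qrcode: look up credentials by the site's origin and answer with the credential JSON array described above. The store layout and function name are hypothetical, not the addon's actual code:

```python
import json
from urllib.parse import urlparse

# Hypothetical mobile-side password store: site origin -> [username, password].
STORE = {
    "https://github.com": ["axel@nennker.de", "password"],
}

def credentials_for(request_qr: str) -> str:
    """Given the login URL read from the addon's request-qrcode,
    return the matching credentials as the JSON array the addon expects."""
    parsed = urlparse(request_qr)
    origin = f"{parsed.scheme}://{parsed.netloc}"
    return json.dumps(STORE[origin])

print(credentials_for("https://github.com/login"))
# ["axel@nennker.de", "password"]
```

Keying the store by origin rather than full URL means the same credentials match both the login page and any other page on the site that asks for them.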

What are the benefits?

What are the drawbacks?


Screenshots:

Login page at github with the addon installed:



Screen after pressing the addon's toolbar icon. The qrcode helps the mobile ID-client to find the matching credentials:

Screen showing the camera picture which is scanned for qrcodes:

This is clearly only a first step, but I believe it has the potential to be a truly user-centric solution that helps me and you handle the password mess.

30 Mar 2015 1:52pm GMT

28 Mar 2015

feedPlanet Identity

Bill Nelson - Easy Identity: OpenDJ and the Fine Art of Impersonation

Directory servers are often used in multi-tier applications to store user profiles, preferences, or other information useful to the application. Oftentimes the web application includes an administrative console to assist in the management of that data; allowing operations such as user creation or password reset. Multi-tier environments pose a challenge, however, as it is […]

28 Mar 2015 1:55pm GMT

27 Mar 2015

feedPlanet Identity

Julian Bond: Avast thah, me hearties!

Google have a new auto-proxy service that speeds up unencrypted web pages by compressing them on Google's proxy servers between the website and your device.
https://support.google.com/chrome/answer/2392284?p=data_saver_on&rd=1

This has an unintended but hilarious side effect. There's a bunch of websites that are blocked by UK ISPs for copyright issues. So if you go to, for instance, http://newalbumreleases.net/ you will normally be blocked by a Virgin Media/BT/TalkTalk warning message. But if you have Google's data saving Chrome extension installed, it acts like a VPN and sidesteps the block.

Then there's https://thepiratebay.se/ They've successfully implemented an https:// scheme that also sidesteps the same UK ISP block.

For the moment, we're saved. But stand by to repel boarders!

[from: Google+ Posts]

27 Mar 2015 9:22am GMT

26 Mar 2015

feedPlanet Identity

Paul Madsen: NAPPS - a rainbow of flavours

Below is an arguably unnecessarily vibrant swimlane diagram of the proposed NAPPS (Native Applications) flow for an enterprise-built native application calling an on-prem API.

The very bottom arrow of the flow (that from Ent_App to Ent_RS) is the actual API call that, if successful, will return the business data to the native app. That call is what we are trying to enable (with all the rainbow-hued exchanges above).

As per normal OAuth, the native application authenticates to the RS/API by including an access token (AT). Also shown is the possibility of the native application demonstrating proof of possession for that token, but I'll not touch on that here other than to say that the corresponding spec work is underway.

What differs in a NAPPS flow is how the native application obtains that access token. Rather than the app itself taking the user through an authentication & authorization flow (typically via the system browser), the app gets its access token via the efforts of an on-device 'Token Agent' (TA).

Rather than requesting an access token from a network Authorization Service (as in OAuth or Connect), the app logically makes its request of the TA, labelled below as 'code Request + PKSE'. Upon receiving such a request from an app, the TA will endeavour to obtain from the Ent_AS an access token for the native app. This step is shown in green below. The TA uses a token it had previously obtained from the AS in order to obtain a new token for the app.

In fact, what the TA obtains is not the access token itself, but an identity token (as defined by Connect) that can be exchanged by the app for the more fundamental access token - as shown in pink below. While this may seem like an unnecessary step, it actually

  1. mirrors how normal OAuth works, in which the native app obtains an authz code and then exchanges that for the access token (this having some desirable security characteristics)
  2. allows the same pattern to be used for a SaaS app, i.e. one where there is another AS in the mix and we need a means to federate identities across the policy domains.

When I previously wrote 'TA uses a token it had previously obtained from the AS', I was referring to the flow coloured in light blue above. This is a pretty generic OAuth flow; the only novelty is the introduction of the PKSE mechanism, which protects against a malicious app stealing tokens by sitting on the app's custom URL scheme.
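The proof-key idea can be sketched as follows (the names here are illustrative, not taken from the NAPPS spec): the app hashes a one-time secret, sends only the hash alongside its code request, and later reveals the secret, so the AS can verify that the party redeeming the code is the same party that requested it:

```python
import base64
import hashlib
import secrets

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as OAuth proof-key schemes do."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

# App side: generate a one-time verifier and derive its challenge (SHA-256).
# Only the challenge travels with the initial code request.
verifier = b64url(secrets.token_bytes(32))
challenge = b64url(hashlib.sha256(verifier.encode()).digest())

# AS side: when the code is redeemed, recompute the hash of the presented
# verifier and compare it against the challenge stored with the code.
def proves_possession(presented: str, stored_challenge: str) -> bool:
    return b64url(hashlib.sha256(presented.encode()).digest()) == stored_challenge
```

A malicious app squatting on the custom URL scheme can intercept the code, but without the verifier (which never left the legitimate app) it cannot redeem it.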



26 Mar 2015 8:22pm GMT

Kantara Initiative: UMA V1.0 Approved as Kantara Recommendation

Congratulations to the UMA Work Group on this milestone! The User-Managed Access (UMA) Version 1.0 specifications have been finalized as Kantara Initiative Recommendations, the highest level of technical standardization Kantara Initiative can award. UMA has been developed over the last several years by industry leaders in our UMA Work Group. Check out the UMA WG […]

26 Mar 2015 5:36pm GMT

25 Mar 2015

feedPlanet Identity

Gluu: UMA 1.0 Approved by Unanimous Vote!

This week voting member organizations at the Kantara Initiative unanimously approved the User Managed Access (UMA) 1.0 specification, a new standard profile of OAuth2 for delegated web authorization. More than half of the member organizations were accounted for on the vote to reach quorum and provide the support needed for approval. The unanimous approval of … Read more >>

25 Mar 2015 6:08pm GMT


Christopher Allen - Alacrity: 10 Design Principles for Governing the Commons

In 2009, Elinor Ostrom received the Nobel Prize in Economics for her "analysis of economic governance, especially the commons". Since then I've seen a number of different versions of her list of the 8 principles for effectively managing against the...

25 Mar 2015 3:55am GMT

24 Mar 2015

feedPlanet Identity

Kantara Initiative: Kantara Initiative grants Scott S. Perry CPA, PLLC Accredited Assessor Trustmark at Assurance Levels 1, 2, 3 and 4

PISCATAWAY, NJ- (24 March, 2015) - Kantara Initiative is proud to announce that Scott S. Perry CPA, PLLC is now a Kantara-Accredited Assessor with the ability to perform Kantara Service Assessments at Assurance Levels 1, 2, 3 and 4. Scott S. Perry CPA, PLLC is approved to perform Kantara Assessments in the jurisdictions of USA, […]

24 Mar 2015 6:20pm GMT

Radovan Semančík - nLight: Comparing Disasters

A month ago I described my disappointment with OpenAM. My rant obviously attracted some attention in one way or another. But perhaps the best reaction came from Bill Nelson. Bill does not agree with me. Quite the contrary. And he has some good points that I can somehow agree with. But I cannot agree with everything that Bill points out, and I still think that OpenAM is a bad product. I'm not going to discuss each and every point of Bill's blog. I would summarize it like this: if you build on a shabby foundation, your house will inevitably turn to rubble sooner or later. If a software system cannot be efficiently refactored, it is as good as dead.

However, this is not what I wanted to write about. There is something much more important than arguing about the age of the OpenAM code. I believe that OpenAM is a disaster. But it is an open source disaster. Even though it is bad, I was able to fix it and make it work. It was not easy and it consumed some time and money. But it is still better than my usual experience with the support of closed-source software vendors. Therefore I believe that any closed-source AM system is inherently worse than OpenAM. Why is that, you ask?

Firstly, I was able to fix OpenAM just by looking at the source code, without any help from ForgeRock. Nobody can do this for a closed-source system, except the vendor. A running system is extremely difficult to replace. Vendors know that. The vendor can ask for an unreasonable sum of money even for a trivial fix. Once the system is up and running, the customer is trapped. Locked in. No easy way out. Maybe some of the vendors will be really nice and won't abuse this situation. But I would not bet a penny on that.

Secondly, what are the chances of choosing a good product in the first place? Anybody can have a look at the source code and see what OpenAM really is before committing any money to deploy it. But if you are considering a closed-source product you won't be able to do that. The chances are that the product you choose is even worse. You simply do not know. And what is even worse is that you do not have any realistic chance to find it out until it is too late and there is no way out. I would like to believe that all software vendors are honest and that all glossy brochures tell the truth. But I simply know that this is not the case...

Thirdly, you may be tempted to follow the "independent" product reviews. But there is a danger in getting advice from someone who benefits from cooperation with the software vendors. I cannot speak about the whole industry as I'm obviously not omniscient. But at least some major analysts seem to use evaluation methodologies that are not entirely transparent. And there might be a lot of motivations at play. Perhaps the only way to be sure that the results are sound is to review the methodology. But there is a problem. The analysts are usually not publishing details about the methodologies. Therefore what is the real value of the reports that the analysts distribute? How reliable are they?

This is not really about whether product X is better than product Y. I believe that this is an inherent limitation of the closed-source software industry. The risk of choosing an inadequate product is just too high, as customers are not allowed to access the data that are essential to making a good decision. I believe in this: the vendor that has a good product does not need to hide anything from the customers. So there is no problem for such a vendor to go open source. If the vendor does not go open source then it is possible (maybe even likely) that there is something he needs to hide from the customers. I recommend avoiding such vendors.

It will be the binaries built from the source code that will actually run in your environment. Not the analyst charts, not the pitch of the salesmen, not even the glossy brochures. The source code is the only thing that really matters. The only thing that is certain to tell the truth. If you cannot see the source code then run away. You will probably save a huge amount of money.

(Reposted from https://www.evolveum.com/comparing-disasters/)

24 Mar 2015 6:09pm GMT

Vittorio Bertocci - Microsoft: Identity Libraries: Status as of 03/23/2015

Time for another update to the libraries megadiagram! If you are nostalgic, you can find the old one here.

So, what's new? Well, we added an entire new target platform, .NET core and associated ASP.NET vNext - which resulted in a new drop of ADAL .NET 3.x and new OpenId [...]

24 Mar 2015 6:18am GMT

Gerry Beuchelt - MITRE: CI and CND – Revisited

About this time last year I discussed my thoughts on Counterintelligence (CI) and Computer Network Defense (CND). My basic proposition then was that CND is materially identical (or - more precisely - a monomorphism) to a restriction of CI to Cyber activities. I think that I was way too hesitant in making this claim. After Continue Reading →

24 Mar 2015 12:54am GMT

23 Mar 2015

feedPlanet Identity

Bill Nelson - Easy Identity: Hacking OpenAM – An Open Response to Radovan Semancik

I have been working with Sun, Oracle and ForgeRock products for some time now and am always looking for new and interesting topics that pertain to theirs and other open source identity products. When Google alerted me to the following blog posting, I just couldn't resist: Hacking OpenAM, Level: Nightmare Radovan Semancik | February 25, 2015 […]

23 Mar 2015 6:06pm GMT