liferay Archives | XTIVIA

Create Your Own Liferay Custom Registration Process


Liferay makes it easy to customize the registration process. Recently, a customer required Liferay to verify the email address immediately after registration and then automatically log the user in once verified. This post shows how to use a hook to create your own Liferay custom registration process.

To do this with a hook, override two struts actions in liferay-hook.xml:

<struts-action>
    <struts-action-path>/login/create_account</struts-action-path>
    <struts-action-impl>com.xtivia.hook.CustomCreateAccountAction</struts-action-impl>
</struts-action>
<struts-action>
    <struts-action-path>/portal/verify_email_address</struts-action-path>
    <struts-action-impl>com.xtivia.hook.CustomVerifyEmailAddressAction</struts-action-impl>
</struts-action>

The first class, CustomCreateAccountAction, is called when the user registers. On success, it needs to redirect to the email verification page:

actionResponse.sendRedirect(themeDisplay.getPathMain() + "/portal/verify_email_address");
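
For reference, below is a minimal sketch of what CustomCreateAccountAction might look like as a hook struts action. The class and package names come from the liferay-hook.xml above, but the body is our own illustration; it assumes the default action succeeded, so error handling via SessionErrors is omitted.

package com.xtivia.hook;

import javax.portlet.ActionRequest;
import javax.portlet.ActionResponse;
import javax.portlet.PortletConfig;

import com.liferay.portal.kernel.struts.BaseStrutsPortletAction;
import com.liferay.portal.kernel.struts.StrutsPortletAction;
import com.liferay.portal.kernel.util.WebKeys;
import com.liferay.portal.theme.ThemeDisplay;

public class CustomCreateAccountAction extends BaseStrutsPortletAction {

    @Override
    public void processAction(StrutsPortletAction originalStrutsPortletAction,
            PortletConfig portletConfig, ActionRequest actionRequest,
            ActionResponse actionResponse) throws Exception {

        // Let Liferay's default create-account logic run first.
        originalStrutsPortletAction.processAction(
            portletConfig, actionRequest, actionResponse);

        ThemeDisplay themeDisplay =
            (ThemeDisplay) actionRequest.getAttribute(WebKeys.THEME_DISPLAY);

        // On success, send the newly registered user to the email verification page.
        // A real implementation should check SessionErrors before redirecting.
        actionResponse.sendRedirect(
            themeDisplay.getPathMain() + "/portal/verify_email_address");
    }
}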

When users are required to verify their email address, Liferay creates a Ticket object with a verification code. The verification code is sent to the user with a link via email. The link points to a page where the user can enter the verification code they were sent. The unique verification code is looked up in the Ticket table, and if matched the email address is considered verified.

In the second class, CustomVerifyEmailAddressAction, I added the code below to validate the verification key entered by the user and fetch the user who just verified:

protected User verifyEmailAddress(HttpServletRequest request, HttpServletResponse response, ThemeDisplay themeDisplay)
        throws Exception {
    AuthTokenUtil.checkCSRFToken(request, CustomVerifyEmailAddressAction.class.getName());

    String ticketKey = ParamUtil.getString(request, "ticketKey");
    Ticket ticket = TicketLocalServiceUtil.getTicket(ticketKey);
    // the ticket's extra info holds the email address of the user being verified
    String email = ticket.getExtraInfo();
    UserLocalServiceUtil.verifyEmailAddress(ticketKey);
    return UserLocalServiceUtil.getUserByEmailAddress(themeDisplay.getCompanyId(), email);
}

Add a new method for auto login:

public static void login(User user, HttpServletRequest request) throws Exception {
    String username = String.valueOf(user.getUserId());
    String password = user.getPassword();
    boolean encPassword = user.isPasswordEncrypted();
    HttpSession session = request.getSession();
    session.setAttribute("j_username", username);
    if (encPassword) {
        session.setAttribute("j_password", password);
    } else {
        throw new Exception("Password encryption not implemented");
    }
    session.setAttribute("j_remoteuser", username);
}

Finally, in the execute method of CustomVerifyEmailAddressAction, call the two methods this way:

User user = verifyEmailAddress(request, response, themeDisplay);
if (user != null) {
  login(user, request);
  response.sendRedirect(PortalUtil.getPathMain() + "/portal/login");
}

 



Liferay Salesforce Opportunities Portlet


Organizations are showing a lot of interest in sales numbers, and it's not just those directly involved in sales. This typically means extracting information from a CRM, the most prominent choice being Salesforce. IT teams are always looking for freely available solutions to present this data to their enterprise via their intranet portals. The objective of this blog is to present such a solution.

At Xtivia, we are planning to release a number of such Salesforce applications with highly configurable options in the Liferay Marketplace to help enterprises meet their needs for Salesforce CRM integration.

This solution is a SPA (Single Page Application) portlet for Liferay written in AngularJS. The portlet takes care of the following:

  • Provides a configuration screen for administrators to store connection parameters for a Salesforce instance.
  • Authenticates to Salesforce using OAuth2.
  • Fetches Account/Opportunity information from Salesforce via the REST API.
  • Filters the Opportunity information with pre-defined filter criteria.
  • Uses one of XTIVIA's internal frameworks, XSF, which resolves the issue of CORS (Cross-Origin Resource Sharing) between your application domain and force.com when making AJAX calls.

To use this portlet, download and install it from the Liferay Marketplace, and then place it on a page of your choice. Before you can configure the portlet, however, you will need to create a Salesforce Connected App so that this portlet can connect to Salesforce. After you've registered your Salesforce app, enter the Salesforce app connection parameters in the portlet configuration. Below is a screen shot of the configuration.

Salesforce Opportunities Portlet Configuration

The portlet is now ready to show the Salesforce data. Here is how the Salesforce Opportunities portlet presents it in your Liferay Portal.

Salesforce Opportunities List

Salesforce Opportunity Detail

Based on the feedback we receive, we will enhance this portlet with more features. The portlet is meant to present upcoming opportunities from the Salesforce system to a wider audience inside an organization.

If you have any questions or ideas for enhancements, please reach out to us at: info@xtivia.com



Integrating TED Events with Liferay


TED is an incredible non-profit organization. It provides an excellent platform for influential speakers to stir audience curiosity with some of the most awesome ideas on this planet. TED, as its Ideas worth spreading tagline suggests, provides an API to power this mission. XTIVIA has developed a portlet to bring TED's incredible content to the Liferay community. This portlet is available in the Liferay Marketplace and has been tested with Liferay 6.2 CE and EE.

The TED event portlet is a SPA (Single Page Application) portlet for Liferay written in AngularJS. The portlet takes care of the following:

  • Provides a configuration screen for administrators to store a TED API key.
  • Fetches TED event information and presents it in a card layout.
  • Uses one of XTIVIA's internal frameworks, XSF, which resolves the issue of CORS (Cross-Origin Resource Sharing) between your application domain and the TED API when making AJAX calls.

In its current form, this portlet presents TED events and details provided through the basic TED API. The portlet follows TED's API usage and branding guidelines, as described at http://developer.ted.com/terms and http://developer.ted.com/Branding_Guidelines.

To use this portlet, download and install it from the Liferay Marketplace, then place it on a page of your choice. Before you can configure the portlet, however, you will need your own TED API key. Register with TED at http://developer.ted.com and then create an application. TED will issue an API key via email to access TED data. Below is a screen shot showing the application section of the TED site.

TED Application Configuration

Once you have your API key, just enter it in the portlet configuration and you are all set.

TED Portlet Configuration

The portlet shows the list of events available from the API (https://api.ted.com/v1/tedx_events.json?api-key=<your-api-key>). Each event is displayed as a YouTube video. The event details are available by clicking on the flip icon (the arrow in the upper right of each card).
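
If you want to sanity-check your API key outside the portlet, a quick standalone Java snippet like the one below (our own example, not part of the portlet) can dump the raw JSON returned by the same endpoint:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class TedEventsCheck {

    public static void main(String[] args) throws Exception {
        // Pass your TED API key as the first argument.
        String apiKey = args.length > 0 ? args[0] : "<your-api-key>";
        URL url = new URL("https://api.ted.com/v1/tedx_events.json?api-key=" + apiKey);

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(url.openStream(), "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // raw JSON payload of TEDx events
            }
        }
    }
}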

TED Events List

TED Event Detail

In the future, we intend to enhance this portlet with more content from TED, like TED Talks, conference details, and support for TED API users. Until then, stay tuned.

If you have any questions or ideas for enhancements, please reach out to us at: info@xtivia.com



Liferay Clustering with TCP Unicast


A key step for setting up a reliable Liferay environment is to configure Liferay clustering. Liferay clustering can be achieved in multiple ways. Some of the supported clustering methods are UDP Multicast, UDP Unicast and TCP Unicast. While Multicast is recommended for better clustering performance, in most enterprise environments and in all Cloud environments such as AWS and Microsoft Azure, UDP Multicast is disabled. As a general practice, if UDP Multicast is not available or disabled in the infrastructure, Xtivia recommends TCP Unicast for clustering Liferay. TCP Unicast is typically more reliable than UDP Unicast for Liferay clustering.

This post covers the steps required to set up TCP Unicast clustering in Liferay 6.2 EE environments. The same instructions apply to enterprise (on-premises) infrastructure as well as the Azure and AWS platforms. Throughout this post, common locations have been abstracted out as variable names for brevity; these variables have no meaning outside of the document itself. Example variables include $host1_FQDN, $name_of_the_s3bucket, $AWS_access_key, and $AWS_secret_key.

Configure TCP Unicast clustering for Liferay

To achieve TCP Unicast clustering for a Liferay environment, the following steps need to be performed on all the Liferay instances that belong to the cluster.

Step 1. Ensure that ports are open

For Liferay clustering to be successful, the hosts need to be able to communicate with each other to send clustering packets across the network, so make sure the clustering port is open between all nodes. For our example, we will use port 7800 for TCP Unicast clustering in Liferay. In this post we provide example configurations that use TCPPING and AWS S3_PING for cluster member discovery; other discovery options for TCP Unicast clustering are FILE_PING and JDBC_PING.
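
As a quick way to confirm that the clustering port is reachable between nodes before digging into the JGroups configuration, any TCP connectivity test will do; the small standalone Java utility below is one hedged example of such a check (host name and port are placeholders, and something must already be listening on the target port, for example Liferay/JGroups on the other node).

import java.net.InetSocketAddress;
import java.net.Socket;

public class ClusterPortCheck {

    public static void main(String[] args) throws Exception {
        // Placeholder values; pass the other node's FQDN and the JGroups port.
        String host = args.length > 0 ? args[0] : "$host2_FQDN";
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 7800;

        try (Socket socket = new Socket()) {
            // Fails with an exception if the port is blocked or nothing is listening.
            socket.connect(new InetSocketAddress(host, port), 3000);
            System.out.println("Port " + port + " on " + host + " is reachable");
        }
    }
}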

Step 2. Configure TCP Unicast

2.1 For non-AWS environment

Next, create an XML file containing the TCP Unicast (JGroups) configuration. We recommend copying this file to the app server's global library so that it can be read by the Liferay application on startup.

<!--
TCP based stack, with flow control and message bundling. This is usually used when IP
multicasting cannot be used in a network, e.g. because it is disabled (routers discard
multicast). Note that TCP.bind_addr and TCPPING.initial_hosts should be set, possibly
via system properties, e.g. -Djgroups.bind_addr=192.168.5.2 and
-Djgroups.tcpping.initial_hosts=192.168.5.2[7800]
author: Bela Ban
-->
<config xmlns="urn:org:jgroups"
		xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
		xsi:schemaLocation="urn:org:jgroups http://www.jgroups.org/schema/JGroups-3.1.xsd">
	<TCP bind_port="7800"
			singleton_name="Liferay"
			loopback="false"
			recv_buf_size="${tcp.recv_buf_size:5M}"
			send_buf_size="${tcp.send_buf_size:640K}"
			max_bundle_size="64K"
			max_bundle_timeout="30"
			enable_bundling="true"
			use_send_queues="true"
			sock_conn_timeout="300" 
                        timer_type="old"
			timer.min_threads="4"
			timer.max_threads="10"
			timer.keep_alive_time="3000"
			timer.queue_max_size="500"
			thread_pool.enabled="true"
			thread_pool.min_threads="1"
			thread_pool.max_threads="10"
			thread_pool.keep_alive_time="5000"
			thread_pool.queue_enabled="false"
			thread_pool.queue_max_size="100"
			thread_pool.rejection_policy="discard" 
                        oob_thread_pool.enabled="true"
			oob_thread_pool.min_threads="1"
			oob_thread_pool.max_threads="8"
			oob_thread_pool.keep_alive_time="5000"
			oob_thread_pool.queue_enabled="false"
			oob_thread_pool.queue_max_size="100"
			oob_thread_pool.rejection_policy="discard"/>
	<TCPPING timeout="3000"
		initial_hosts=
                        "$host1_FQDN[7800],$host2_FQDN[7800],$host3_FQDN[7800],$host4_FQDN[7800]"
			port_range="1"
			num_initial_members="10"/>
	<MERGE2 min_interval="10000"
			max_interval="30000"/>
	<FD_SOCK/>
	<FD timeout="3000" max_tries="3" />
	<VERIFY_SUSPECT timeout="1500" />
	<BARRIER />
	<pbcast.NAKACK2 use_mcast_xmit="false"
			discard_delivered_msgs="true"/>
	<UNICAST />
	<pbcast.STABLE stability_delay="1000" 
                        desired_avg_gossip="50000"
			max_bytes="4M"/>
	<pbcast.GMS print_local_addr="true" 
                        join_timeout="3000" 
                        view_bundling="true"/>
	<UFC max_credits="2M"
			min_threshold="0.4"/>
	<MFC max_credits="2M"
			min_threshold="0.4"/>
	<FRAG2 frag_size="60K" />
	<!--RSVP resend_interval="2000" 
                        timeout="10000"/-->
	<pbcast.STATE_TRANSFER/>
</config>

2.2 For AWS environment

For Liferay deployed on AWS, we recommend the following configuration, which uses S3_PING for cluster member discovery. As before, we recommend copying the file to the app server's global library so that it can be read by the Liferay application on startup. Also note that an S3 bucket that can be accessed by all Liferay instances must be created before configuring TCP Unicast clustering for the Liferay instances.

<!--
TCP based stack, with flow control and message bundling. This is usually used when IP
multicasting cannot be used in a network, e.g. because it is disabled (routers discard
multicast).Note that TCP.bind_addr and TCPPING.initial_hosts should be set, possibly
via system properties, e.g.-Djgroups.bind_addr=192.168.5.2 and
-Djgroups.tcpping.initial_hosts=192.168.5.2[7800]
author: Bela Ban
-->
<config xmlns="urn:org:jgroups"
		xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
		xsi:schemaLocation="urn:org:jgroups http://www.jgroups.org/schema/JGroups-2.12.xsd">
	<TCP bind_port="7800"
			singleton_name="LIFERAY_CLUSTER"
			loopback="true"
			recv_buf_size="${tcp.recv_buf_size:20M}"
			send_buf_size="${tcp.send_buf_size:640K}"
			discard_incompatible_packets="true"
			max_bundle_size="64K"
			max_bundle_timeout="30"
			enable_bundling="true"
			use_send_queues="true"
			sock_conn_timeout="300"
			timer_type="new"
			timer.min_threads="4"
			timer.max_threads="10"
			timer.keep_alive_time="3000"
			timer.queue_max_size="500"
			thread_pool.enabled="true"
			thread_pool.min_threads="1"
			thread_pool.max_threads="10"
			thread_pool.keep_alive_time="5000"
			thread_pool.queue_enabled="false"
			thread_pool.queue_max_size="100"
			thread_pool.rejection_policy="discard"
			oob_thread_pool.enabled="true"
			oob_thread_pool.min_threads="1"
			oob_thread_pool.max_threads="8"
			oob_thread_pool.keep_alive_time="5000"
			oob_thread_pool.queue_enabled="false"
			oob_thread_pool.queue_max_size="100"
			oob_thread_pool.rejection_policy="discard"/>
	<S3_PING location="$name_of_the_s3bucket" 
			access_key="$AWS_access_key"
			secret_access_key="$AWS_secret_key" 
			timeout="2000"
			num_initial_members="2"/>
	<MERGE2 min_interval="10000"
			max_interval="30000"/>
	<FD_SOCK/>
	<FD timeout="3000" max_tries="3" />
	<VERIFY_SUSPECT timeout="1500" />
	<BARRIER />
	<pbcast.NAKACK2
			use_mcast_xmit="false"
			discard_delivered_msgs="true"/>
	<UNICAST timeout="300,600,1200" />
	<pbcast.STABLE stability_delay="1000" 
			desired_avg_gossip="50000"
			max_bytes="4M"/>
	<pbcast.GMS print_local_addr="true" 
			join_timeout="3000"
			view_bundling="true"/>
	<UFC max_credits="2M"
			min_threshold="0.4"/>
	<MFC max_credits="2M"
			min_threshold="0.4"/>
	<FRAG2 frag_size="60K" />
</config>

Step 3. Add properties to portal-ext.properties

For Liferay to use TCP Unicast clustering, the following properties need to be included in portal-ext.properties. Please note that we recommend setting the cluster link auto detect address to the database host and port. The assumption we make here is that the database host is always available and can be used by Liferay instances to accurately determine the network interface to use for clustering purposes.

##
## Cluster Link
##
#
# Set this to true to enable the cluster link. This is required if you want
# to cluster indexing and other features that depend on the cluster link.
#
cluster.link.enabled=true
#cluster link channel properties
cluster.link.channel.properties.control=$name_of_the_tcp_unicast.xml
cluster.link.channel.properties.transport.0=$name_of_the_tcp_unicast.xml
cluster.link.autodetect.address=$database_host:$port
ehcache.cluster.link.replication.enabled=true

Step 4: Deploy ehcache-cluster-web application

As a final step for configuring Liferay clustering, the Ehcache Cluster EE application from Liferay Marketplace needs to be deployed to all Liferay instances. Currently this can be found at https://www.liferay.com/marketplace/-/mp/application/15099166.

Step 5: Restart Liferay and verify

To verify that clustering is working as expected, take a look at the app server logs. The logs should contain lines similar to the following, showing successful cluster initialization:

INFO [localhost-startStop-1][ClusterBase:142] Autodetecting JGroups outgoing IP address
and interface for $database_host:port
INFO [localhost-startStop-1][ClusterBase:158] Setting JGroups outgoing IP address to
172.31.42.34 and interface to eth0
-------------------------------------------------------------------
GMS: address=ip-172-31-42-34-20199, cluster=LIFERAY-CONTROL-CHANNEL, physical
address=172.31.42.34:7800
-------------------------------------------------------------------
-------------------------------------------------------------------
GMS: address=ip-172-31-42-34-39646, cluster=LIFERAY-TRANSPORT-CHANNEL-0, physical
address=172.31.42.34:7800
-------------------------------------------------------------------

and lines similar to the following showing successful connectivity between the cluster members:

INFO  [localhost-startStop-1][BaseReceiver:64] Accepted view [ip-172-31-35-224-26939|11]
[ip-172-31-35-224-26939, ip-172-31-42-34-39646]

Alternatively, you could make changes to a web content article or a page on one instance of the cluster and verify that the changes are replicated to the other members of the cluster.

 

 


Enable site wide settings for the Liferay Language Portlet


In this post, I am going to show you how to enable site wide settings for the Language Portlet, so you can add the portlet to your header or footer without having to edit the settings for each and every page.

The Problem

In Liferay, the Language Portlet uses a different set of settings for every page. This is fine if you have the portlet added to only one page on your site. However, if you need it on multiple pages, like in the header or footer, you will need the same settings on every page.


Default Settings popup for the Language Portlet. Be careful, these are different for every page!

 

The Solution

The simplest solution I found for this issue is to create a JSP hook and pull the settings from portal-ext.properties instead of the portlet's config screen. The one disadvantage to this approach is that you need to restart Liferay whenever you change a setting; however, I don't see this as a big issue because available languages and display styles are rarely changed.

Embedding the Language Portlet inside your theme

Inside portal_normal.vm (assuming you are using a Velocity theme), add this snippet where you want the Language Portlet displayed:

${theme.runtime("82")}

Override JSP files using a hook

This is very simple, because you only need to override two files, configuration.jsp and init.jsp, which are located in webapps/ROOT/html/portlet/language. Note: do not change these files directly; make sure you override them with a hook.

configuration.jsp:

The code in this file should be completely commented out, so it does not confuse anyone who tries to use it later. I added this reminder note at the top of the file, which is displayed if an admin user clicks the configuration link on the portlet topper:

Note: This config screen is now obsolete, due to the language-hook.
The settings for supportedLocales, displayStyle and displayCurrentLocale are now pulled from
portal-ext.properties.

Everything else in the file must be commented out.

init.jsp:

You want to comment out the lines that set languageIds, displayCurrentLocale, and displayStyle, and get those values from portal-ext.properties instead. Here is a before-and-after view of that file:
Before:

Locale[] availableLocales = LanguageUtil.getAvailableLocales(themeDisplay.getSiteGroupId());
String[] availableLanguageIds = LocaleUtil.toLanguageIds(availableLocales);
String[] languageIds = StringUtil.split(portletPreferences.getValue("languageIds", StringUtil.merge(availableLanguageIds)));
boolean displayCurrentLocale = GetterUtil.getBoolean(portletPreferences.getValue("displayCurrentLocale", null), true);
int displayStyle = GetterUtil.getInteger(portletPreferences.getValue("displayStyle", StringPool.BLANK));

After:


Locale[] availableLocales = LanguageUtil.getAvailableLocales(themeDisplay.getSiteGroupId());
String[] availableLanguageIds = LocaleUtil.toLanguageIds(availableLocales);
/*------------------------------------------------------------------
The following lines are commented out, because we want to get all settings from
one place in order to make them the same across the entire instance.
String[] languageIds = StringUtil.split(portletPreferences.getValue("languageIds", StringUtil.merge(availableLanguageIds)));
boolean displayCurrentLocale = GetterUtil.getBoolean(portletPreferences.getValue("displayCurrentLocale", null), true);
int displayStyle = GetterUtil.getInteger(portletPreferences.getValue("displayStyle", StringPool.BLANK));
------------------------------------------------------------------*/
int displayStyle = Integer.parseInt(PropsUtil.get("languagePortlet.displayStyle"));
boolean displayCurrentLocale = Boolean.parseBoolean(PropsUtil.get("languagePortlet.displayCurrentLocale"));
String[] languageIds = StringUtil.split(PropsUtil.get("locales.enabled"));

portal-ext.properties:


# FYI: locales.enabled is a default property, but it is also being used for the custom language-hook
locales.enabled=en_US,ja_JP
# displayStyle:
# 0=Icon
# 1=Long Text
# 2=Short Text
# 3=Select Box
languagePortlet.displayStyle=2
languagePortlet.displayCurrentLocale=true

Wrapup

Once you start Liferay and deploy the hook, you will no longer need the default config screen.  All instances of your portlet will use the exact same settings.

Enjoy!


Form parameters not being passed to portlets on Liferay


During Liferay upgrades from v6.1 to v6.2, we often see portlets not working correctly once deployed. The portlets deploy without any errors and show up in the category menu as expected, but their forms no longer behave as expected. If you are experiencing an issue where your form parameters are not being passed to the portlet on Liferay, there is a simple fix.


This issue with form parameters not being passed to portlets on Liferay will not happen for every portlet. It depends on how the portlet was written and what portlet framework was used. If you don’t want to change any of your portlet code, then you can usually add this snippet to your liferay-portlet.xml:

<requires-namespaced-parameters>false</requires-namespaced-parameters>

For more information on this topic, see this blog post from Liferay: https://www.liferay.com/web/meera.success/blog/-/blogs/liferay-requires-name-spaced-parameters


Create a Zoomable Image Portlet for Liferay


This blog shows you how to create a portlet that lets you zoom in and out of an image. The portlet is driven by web content and built with Liferay 6.2, jQuery, and animate.css. I have included all the source code for this project at the bottom of this page.

How it Looks:

When you first bring up the portlet, you see an image in a scrollable container with various points defined. The scrollbar is required and important: the points are defined using x and y coordinates, so to ensure they always point to the same location on the image, the image is not resizable. If the portlet is viewed on a mobile device, the scrollbar appears and the points still map to the same spots on the image.

Main Image

After you click on a point, you’ll see more details, as shown below. To close the image, click ‘close’ and the portlet zooms back to the original image.

Zoomed in Portlet Image

Technical Details

  • This portlet uses animate.css to animate the images as they float into the viewport.
  • The Java portlet code basically does the following:
    • Looks up the web content using a specified tag.
    • Pulls the required values out of the web content, using the Liferay XML parsing API.
    • Instantiates a zoomable class and puts it into the request.

Structure:
The Web Content structure contains the following fields:

  1. Title
  2. MainImage
  3. Repeatable Points, including
    1. Title
    2. Description
    3. Image
    4. X coordinate
    5. Y coordinate

The source for this structure is also in the ZoomableStructure.xml file inside the zip file at the bottom of this page.

Java Portlet Code

JournalArticle wc = JournalArticleLocalServiceUtil.getArticle(groupId, String.valueOf(ae.getClassPK() - 2));
Document journalDoc = SAXReaderUtil.read(wc.getContentByLocale(themeDisplay.getLocale().toString()));
Document titleDoc = SAXReaderUtil.read(wc.getTitle());
Node mainImageNode = journalDoc.selectSingleNode(String.format("root/dynamic-element[@name='%1$s']/dynamic-content[@language-id='%2$s']", "mainImage", locale));
Node mainTitleNode = titleDoc.selectSingleNode(String.format("//Title[@language-id='%1$s']", locale));

z = new Zoomable(mainTitleNode.getText(), mainImageNode.getText());

Node root = journalDoc.getRootElement();

List<Node> pointNodes = root.selectNodes("/root/dynamic-element[@name='pointEnabled']");

ZoomPoint point;
for (int i = 0; i < pointNodes.size(); i++) {
    Node pointNode = pointNodes.get(i);

    Node enabledNode = pointNode.selectSingleNode(String.format("dynamic-content[@language-id='%1$s'][last()]", locale));
    Node titleNode = pointNode.selectSingleNode(String.format("dynamic-element[@name='%1$s']/dynamic-content[@language-id='%2$s']", "title", locale));

    if (Boolean.valueOf(enabledNode.getText())) {
        Node descNode = pointNode.selectSingleNode(String.format("dynamic-element[@name='%1$s']/dynamic-content[@language-id='%2$s']", "description", locale));
        Node imageNode = pointNode.selectSingleNode(String.format("dynamic-element[@name='%1$s']/dynamic-content[@language-id='%2$s']", "image", locale));
        Node xNode = pointNode.selectSingleNode(String.format("dynamic-element[@name='%1$s']/dynamic-content[@language-id='%2$s']", "x", locale));
        Node yNode = pointNode.selectSingleNode(String.format("dynamic-element[@name='%1$s']/dynamic-content[@language-id='%2$s']", "y", locale));

        z.addPoint(titleNode.getText(), descNode.getText(), imageNode.getText(), Integer.valueOf(xNode.getText()), Integer.valueOf(yNode.getText()));
    }
}

JSP snippet:

 
<div>
  <div class="carousel-container animated fadeIn" style="overflow:auto">
    <div class="row-fluid">
      <div class="span10">
        <h3 class="outside-text">
          Zoom to see Details: <c:out value='${requestScope.zoomable.title}'/>
        </h3>
      </div>
    </div>
    <div class="row-fluid options">
      <div class="main-module animated slideInRight">
        <div class="tooltips animated bounceInDown">
          <c:forEach var="point" items="${requestScope.zoomable.points}">
            <div id="point3" class="rotate point modulePoint"
                 data-img="<c:out value='${point.image}'/>" data-title="<c:out value='${point.title}'/>" data-desc="<c:out value='${point.desc}'/>"
                 style="top:<c:out value='${point.y}'/>px;left: <c:out value='${point.x}'/>px;"
                 onClick="var image = $(this).attr('data-img');
                          $('.main-module > img').animate({zoom:1.3, opacity:0.0 }, 500);
                          $('.tooltips').toggle();
                          $('.zoomImage > img').attr('src', image);
                          $('.zoom-module').css({display: 'block'});
                          $('.zoom-module').show();
                          $('#zoomTitle').html($(this).attr('data-title'));
                          $('.zoomOutClose').show();
                          $('#zoomDesc').html($(this).attr('data-desc'));">
              <i class="fa fa-search-plus"></i>
            </div>
          </c:forEach>
        </div>
        <image src="<c:out value='${requestScope.zoomable.mainImage}'/>" />
      </div>
    </div>
    <div class="zoom-module">
      <div onClick="var points = $('.tooltips');
                    var zoomModule = $('.zoom-module');
                    zoomModule.addClass('fadeOut');
                    $('.main-module > img').animate({zoom:1, opacity:1.0}, 500);
                    zoomModule.removeClass('animated fadeIn');
                    zoomModule.css({display: 'none'});
                    points.toggle();" id="zoomOut" class="close">
        <span class="smalltext">Close<i class="fa fa-search-minus"></i></span>
      </div>
      <div class="row-fluid">
        <div class="zoomText animated fadeIn span6">
          <h2 id="zoomTitle"></h2>
          <p id="zoomDesc"></p>
        </div>
        <div class="zoomImage animated slideInRight span6">
          <img src="" style="">
        </div>
      </div>
    </div>
  </div>
</div>

Source code
Download the ZoomIn-portlet source code.


Warning: Custom code rendering Liferay web content might not be cached


If you have custom code for rendering Liferay web content, you may not know it but you may not be leveraging Liferay caching for the rendered web content, and this may be at the root of some performance problems in your Liferay environment. Recently, one of our clients ran into this performance issue on one of their sites and I figured that I would share this issue and its solution with the Liferay community.

Oddly enough, the method that you'd assume you should use to get article content is badly behaved: JournalArticleLocalService's getArticleContent. This method should never be used in client code, as it always goes to the database to render your content request, entirely bypassing the cache.

Replace it with the JournalContentUtil.getContent method, which first checks the Liferay cache. If the rendered article is present in the cache, it is used; otherwise, the call delegates down to JournalArticleLocalService's getArticleContent.
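
As a hedged before-and-after sketch (method overloads vary between Liferay versions, so verify the signatures against your release), the change typically looks something like this:

import com.liferay.portal.theme.ThemeDisplay;
import com.liferay.portlet.journal.service.JournalArticleLocalServiceUtil;
import com.liferay.portlet.journalcontent.util.JournalContentUtil;

public class ArticleRenderer {

    // Before: renders straight from the database on every call, bypassing the cache.
    public String renderUncached(long groupId, String articleId, String languageId,
            ThemeDisplay themeDisplay) throws Exception {
        return JournalArticleLocalServiceUtil.getArticleContent(
            groupId, articleId, "view", languageId, themeDisplay);
    }

    // After: checks Liferay's JournalContent cache first and only renders on a miss.
    public String renderCached(long groupId, String articleId, String languageId,
            ThemeDisplay themeDisplay) {
        return JournalContentUtil.getContent(
            groupId, articleId, "view", languageId, themeDisplay);
    }
}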

Make this change and your site should start to perform the way you and your end-users expect.



XTIVIA Services Framework (XSF) – Update


XSF is a framework that XTIVIA has created (and used for multiple client engagements) that enables the rapid development of custom REST services in Liferay. REST services developed using XSF are coded in a fashion similar to JAX-RS or Jersey, but can take advantage of Liferay features such as roles and permissions.

We have talked about XSF here before and just wanted to provide you with a brief update on its current status. We've now included the source code for the framework inside the XSF repository on GitHub. It is located in the 'framework' directory and can be built as a separate JAR; however, you can still create your XSF-based applications either from a Maven archetype or by using the version 1.1.0 JAR available on Maven Central.

Probably the best starting point is to read the PDF documentation for the framework, which can be found here.

We’ve also updated the current version of XSF to support Gradle builds (both the sample application as well as the framework itself). While the current version of XSF is still targeted to Liferay 6.2, Liferay has clearly committed to Gradle as its build tool of choice going forward in Liferay 7 and beyond. We really enjoy using Gradle and think you will too.

Speaking of Liferay 7, we are actively working on migrating XSF to Liferay 7 and we expect to have some exciting news to share in this space soon, so keep checking back here for updates.


Handling transactions across multiple Liferay API calls


There are times when you want to handle multiple Liferay API calls as a single, atomic transaction. For example, let’s say you have a business process that has the following 3 steps:

  1. Create an Organization.
  2. Add a Role to the User.
  3. Update the User’s status.

Let’s say steps 1 & 2 complete successfully, but step 3 fails for some reason. In this case, you should roll back the previous 2 steps.

To do this in Liferay, simply create a new Service Builder project, add all your API calls inside a method of the *ServiceImpl class, and make sure the method throws either a SystemException or a PortalException. No other configuration changes are needed to package these calls into a single transaction.

Here is an example below.
For testing, if "bad" is passed in as the user's first name, the method throws a SystemException, which causes the previous statements in the method to roll back. Of course, remove that check before deploying this code.

public void update(User user, long[] roleIds, String orgName)
        throws SystemException, PortalException {
    long orgId = CounterLocalServiceUtil.increment(Organization.class.getName());
    Organization org = OrganizationLocalServiceUtil.getService().createOrganization(orgId);
    org.setName(orgName);
    OrganizationLocalServiceUtil.getService().updateOrganization(org);
    UserLocalServiceUtil.getService().updateUser(user);
    if (user.getFirstName().equals("bad")) {
        throw new SystemException("Exception thrown due to firstName = 'bad'");
    }
    RoleLocalServiceUtil.getService().addUserRoles(user.getUserId(), roleIds);
}


Tuning Basic JVM Performance for Liferay DXP 7


Over the nearly 10 years that we have worked with the Liferay platform, we have had ample opportunity to hone our understanding of how Liferay interacts with the Java virtual machine (JVM), and how to optimally tune the JVM performance for Liferay as a Java application.

Increase JVM Heap Sizing

Out of the box, the Liferay DXP bundle ships with a set of JVM parameters that are well-geared towards development and experimental usage scenarios, but which are not optimal for testing, staging, and production-type environments. One of the first modifications that a well-tuned Liferay environment needs is a change to the default JVM heap sizing. XTIVIA has found that a good starting point for a Liferay implementation in a shared environment is to set the heap size statically at 3 gigabytes, with approximately half of that space used for the young generation, and a 768 megabyte permanent generation. This allows sufficient heap space to allow in-memory caches to operate well, while keeping garbage collection stop-the-world events down to a minimal duration.

To implement this, add or replace the following to the CATALINA_OPTS variable in ${liferay.home}/tomcat-8.0.32/bin/setenv.sh (or ${liferay.home}/tomcat-8.0.32/bin/setenv.bat for a Microsoft Windows environment):

-Xms3g -Xmx3g -XX:NewSize=1536m -XX:MaxNewSize=1536m -XX:MaxPermSize=768m -XX:PermSize=768m

Enable G1GC

Our experience with Liferay has been that garbage collection optimization is a critical part of a well-tuned environment; historically, this has required the use of the concurrent-mark-sweep garbage collection policy, with all of its reliance on fine-tuning to optimize performance. Luckily for us, Liferay DXP version 7.0 has made Java 7 a baseline requirement, which allows us to leverage the far superior G1GC policy developed by Oracle to guarantee superlative performance with a minimum of tuning overhead. At this point, XTIVIA recommends that all of our Liferay implementations leverage the G1GC garbage collection policy; to enable it, add the following to the same CATALINA_OPTS definition in the setenv.sh (or setenv.bat) file referenced above.

-XX:+UseG1GC

If you are using an environment configured with Java 8 (rather than Java 7), add the following to the above statement to leverage String deduplication:

-XX:+UseStringDeduplication

Note that if you enable G1GC, the following JVM settings included above are no longer necessary.

-XX:NewSize=1536m
-XX:MaxNewSize=1536m
-XX:MaxPermSize=768m
-XX:PermSize=768m
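
Putting the guidance above together, our suggested combination for a Java 8 environment with G1GC enabled would be a flag set like the following (adjust heap sizes to your workload):

-Xms3g -Xmx3g -XX:+UseG1GC -XX:+UseStringDeduplication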

One other change that XTIVIA does recommend is enabling JVM garbage collection logging for all installations; details on this will be included in a later blog post.


Using Liferay Resources Importer to Develop Site Content


I'm going to share some tips and tricks I've used to keep track of web content changes during the development stage of a project. To set the scene, I'm a front-end developer on a small project with a few back-end developers. The client has a lot of starter content in their mock-ups that they want created on the development server and then moved to the QA environment. During these early stages of development, other developers may want their local environment to be as close to the development server as possible. This is where we can let Resources Importer do the heavy lifting and help us generate a .LAR file that can be imported into each developer's local environment.

If you are not familiar with Resources Importer, it allows you to package content, pages, documents, and configurations inside a Liferay theme. When the theme is deployed, it creates a site template or a site, depending on the configuration. In Liferay, web content can be created using structures and templates, and as the client gives feedback these structures and templates may change over time. During the development stage of the project, it's a good idea to put these structures and templates into your source control system (svn, git). Resources Importer uses a common pattern to organize the files in a way that makes the dependencies clear: the folder structure indicates which templates are linked to which structures, and which template each web content article uses.

OK, now I'm going to get into the technical steps for using Resources Importer.

Step One – require the Resources Importer dependency and enable developer mode.
In /WEB-INF/liferay-plugin-package.properties, add the following properties:


required-deployment-contexts=\
resources-importer-web

resources-importer-developer-mode-enabled=true

Developer mode will delete and rebuild the site template each time the theme is deployed. (Note: don't apply the site template to a site until all of the development is done; developer mode cannot delete a site template while it is being used by a site.)

Step Two – organize folders (the folders below are for Maven)


sample-theme
  - src
    - main
      - resources
        - resources-importer
          - document_library
            - documents
          - journal
            - articles
            - structures
            - templates


 

Step Three – define your pages/layouts in sitemap.json file
Create a sitemap.json file located at sample-theme/src/main/resources/resources-importer/


{
  "layoutTemplateId": "1_column",
  "publicPages": [
    {
      "friendlyURL":"/home",
      "name":"Home",
      "title":"Home"
    }
  ]
}

Step Four – create web content with structures and templates

First, create the structure .xml file for the web content and place it in sample-theme/src/main/resources/resources-importer/journal/structures/Basic Web Content.xml.
When creating the structure, you must use Documents and Media for any images so they can be referenced later in the journal article.

Next, create the Velocity or FreeMarker template and place it in the templates folder, making sure the subfolder name matches the name of the structure, e.g. sample-theme/src/main/resources/resources-importer/journal/templates/Basic Web Content/Basic Web Content.vm.

The article is an XML file with the structure's fields filled out. The easiest way to get this XML is to create your web content using the structure and template; if Resources Importer is deployed, a download button is available when you create the web content. Download this file as a starting point and place it in sample-theme/src/main/resources/resources-importer/journal/articles/Basic Web Content/Basic Web Content.xml. The folder name (Basic Web Content) identifies the template that should be used.

Images used in web content should be placed in sample-theme/src/main/resources/resources-importer/document_library/documents/.
Then, in the article XML, reference a Documents and Media item with [$FILE=welcome_cube.png$]. Note that the image field type cannot be used with Resources Importer; the image needs to be a Documents and Media item to be referenced in the web content.

Now, to get your web content to display on a page, let's revisit sitemap.json, located at sample-theme/src/main/resources/resources-importer/sitemap.json:

{
	"layoutTemplateId": "2_columns_ii",
	"publicPages": [
		{
			"columns": [
				[
					{
						"portletId": "58"
					}
				],
				[
					"Basic Web Content.xml"
				]
			],
			"friendlyURL": "/home",
			"name": "Welcome",
			"title": "Welcome"
		}
	]
}

layoutTemplateId: this corresponds to the layout ID. The easiest way to find the layout ID is to view the layouts via Edit in the dockbar and inspect the layout thumbnail image; the image name is the layout ID for the OOTB Liferay layouts.

columns: contains one [] array for each column of the layout.

"portletId": "58": 58 is the portlet ID of the OOTB Sign In portlet.

"Basic Web Content.xml": the file name of the journal article.

A good source for the latest and greatest features of Resources Importer is the test-resources-importer-portlet.

Step Five – turn off developer mode

/WEB-INF/liferay-plugin-package.properties


resources-importer-developer-mode-enabled=false

Step Six (Optional) – Define the Name of the Site

/WEB-INF/liferay-plugin-package.properties

If you set resources-importer-target-value, the site will be created when the theme is deployed. If these properties are not added, Resources Importer will create a site template based on the resources.


resources-importer-developer-mode-enabled=false
resources-importer-target-class-name=com.liferay.portal.model.Group
resources-importer-target-value=Sample Site

Advanced Options – Use Groovy Scripts/ Use Continuous Integration

Resources Importer currently doesn't support every mapping that a LAR file can create, but the benefit of building out the resources is that you keep the source files and the content will work with most future versions. Functionality that isn't included with Resources Importer, such as creating users and blog posts, can be completed with Groovy scripts.

Using a continuous integration solution such as Jenkins can be powerful if each deployment to the dev server deletes the site and regenerates the content; that way, everyone stays on the same page as content gets updated.


How to use Liferay Audience Targeting to Control Navigation Elements


Liferay’s Audience Targeting application raises the engagement experience of your portal to a whole new level. This app allows you to segment your audience, target specific content to different user segments, and create campaigns to target content to user segments. It also allows you to track user actions and generate reports that provide insight into the effectiveness of your campaigns.

As an example, in an intranet scenario for a global multinational company, you might use audience targeting to segment users by location and target content based on these location-based segments. This enables your users to only see content relevant to them and reduces the noise.

In this post I will walk you through a tutorial on how to show or hide navigation pages based on user segments. The idea is that we defer to Liferay's Role Based Access Control (RBAC) to filter the navigation based on the security setup, and then apply another layer of filtering to hide any pages that are tied to user segments the currently logged-in user is not a member of.

The three cases I need to consider when displaying a page are:

  • Show pages that have no user segments associated with them.
  • Show pages that have categories assigned to them, but none related to user segments.
  • Show pages whose selected user segments match the current user.

ServiceLocator needs to be available to make this work. Edit your portal-ext.properties file with the following properties:

velocity.engine.restricted.classes=
velocity.engine.restricted.variables=

To keep the code clean I like to define the main variables I'll be using for the theme in the init_custom.vm file.

init_custom.vm


## -------- Audience Targeting Section -------- ##
#set ($userSegmentLocalService = $serviceLocator.findService("content-targeting-api","com.liferay.content.targeting.service.UserSegmentLocalService"))
#set ($assetCategoryLocalService = $serviceLocator.findService("com.liferay.portlet.asset.service.AssetCategoryLocalService"))
#set ($userSegmentIds = $request.getAttribute("userSegmentIds"))

userSegmentIds holds all of the user segment IDs that match the current user on the site.
userSegmentLocalService will be used to get more information about a particular user segment.
assetCategoryLocalService is used to get the asset categories for each page. Note that an assetCategoryId is different from a userSegmentId: each user segment has an assetCategoryId associated with it, and these are the values we will compare next.

navigation.vm

Below, I'll walk through the relevant parts of the changes to navigation.vm.


#set ($navItemCategoryIds = $assetCategoryLocalService.getCategoryIds("com.liferay.portal.model.Layout", $nav_item.getLayout().getPlid()))

Since the code I added is inside the foreach loop for $nav_item, it runs for each navigation item. The line above gets the assetCategoryIds associated with each page.

The first foreach loop iterates over each userSegmentId and gets the assetCategoryId associated with it. The second foreach loop iterates over each assetCategoryId for the $nav_item. I used two variables as flags: $hasUserSegment and $ignoreCategory.


#foreach ($id in $userSegmentIds)
	#set ($userSegmentId = $userSegmentLocalService.getUserSegment($id).getAssetCategoryId())
        #foreach ($catId in $navItemCategoryIds)
           #if ($userSegmentId == $catId)
		#set ($hasUserSegment = true)
		#break
	   #else
	      #if ($userSegmentLocalService.fetchUserSegmentByAssetCategoryId($catId))
		#set ($hasUserSegment = false)
		#break
	      #else
		 #set ($ignoreCategory = true)
		 #break
	      #end
            #end
         #end
#end

Finally, I wrap the section where the navigation link is displayed in an if statement that displays pages with no user segments or categories, pages whose categories match the current user's user segment assetCategoryId, and pages that don't have any user segments but may have other categories assigned to them.


#if ($navItemCategoryIds.size() == 0 || $hasUserSegment || $ignoreCategory)

Hope you enjoyed this post. Leave comments below if you have any questions.


The Business Case for Managing Liferay with Chef


Chef is a powerful automation platform that transforms complex infrastructure into code, bringing your servers and services to life. Whether you’re operating in the cloud, on-premises, or a hybrid, Chef automates how applications are configured, deployed, and managed across your network, no matter its size. Chef is built around simple concepts: achieving desired state, centralized modeling of IT infrastructure, and resource primitives that serve as building blocks. These concepts enable you to quickly manage any infrastructure with Chef. These very same concepts allow Chef to handle the most difficult infrastructure challenges on the planet. Anything that can run the chef-client can be managed by Chef.
More information can be found at the following website: https://docs.chef.io/chef_overview.html

Liferay's out-of-the-box packaging provides a set of ready-made "bundles" which include a functional application server configured to work with the Liferay application. While these bundles make it convenient for an individual to quickly start up a local Liferay instance for experimentation, they make assumptions that directly conflict with long-term maintainability and scalability. As a result, Xtivia has devised an alternative deployment structure for all Liferay installations; this deployment structure is designed to provide the following benefits:

  1. Scalability for local application server instances
  2. Isolation of the Liferay application server from the main operating system
  3. Standardization of application server instance layout, to aid in automation and maintenance

 

Xtivia has created a set of Chef cookbooks which contain recipes that help with installing, configuring, and managing Liferay installations of any size. A few example scenarios and advantages provided by the Chef recipes are as follows:

  1. The Chef recipes created by Xtivia provide the flexibility to configure the right kind of database, use the correct driver, etc.
  2. The Chef cookbooks created by Xtivia also provide functionality to set up a clustered Liferay environment with a simple Chef recipe.
  3. The Chef recipes created by Xtivia are managed using Berkshelf and are fully compatible with AWS OpsWorks.
  4. You can create a full-stack Liferay environment using the Chef recipes created by Xtivia in a matter of minutes.
  5. Xtivia's recipes can also be used to upgrade Liferay service packs.

 

In a later blog entry, we will discuss some of the various challenges of using Chef to install, configure, and manage Liferay and applications similar to Liferay. If you are interested in learning more about the details of using Chef (or any other configuration management tool) to automate the management of your Liferay environments, reach out to us today!


Rock-Solid Liferay Plugin Deployments


This blog post describes the overall process recommended by Xtivia for deploying custom Liferay plugin applications to Liferay.  Note that while this document is primarily targeted at the deployment of custom Liferay plugins, many of the concepts do apply to non-Liferay web applications being deployed to the Tomcat application server.

Hot deployments in Liferay bundles

Liferay bundles normally ship with Apache Tomcat's hot deployment feature enabled. While hot deployment is a convenient process that makes deployed artifacts available immediately, a number of problems arise when Tomcat hot deployment is enabled that will cause the system to become unstable over time. As a result, Xtivia highly recommends disabling hot deployment at the application server level; hot deployment should only be enabled for lightweight testing on local developer systems.

Problem Definition

The basic issues presented by the use of hot deployment on Apache Tomcat include the following:

  • Ongoing leaks in the JVM’s permanent generation memory space. Enabling hot deployments on a Tomcat application server instance will cause the amount of memory consumed by the permanent generation to increase over time, eventually resulting in an outage.  This is caused by the way that Tomcat handles web application deployments; the only remediation step is to restart the application server instance.
  • Tomcat has also been observed to have problems cleaning up the global classloader during deployments, which can lead to class loader errors within the JVM, again causing an outage.
  • The JVM will at times run into conflicts with the operating system when attempting to update filesystem content during a deployment; this can result in corrupted application deployments, as in-memory content fails to overwrite files which are marked as locked on the filesystem.  These issues are extremely difficult to detect or triage.
  • The preprocessing that Liferay applies to any plugin should be executed at deploy time in order to minimize build overhead and avoid potential mismatches between the target Liferay installation and the custom Liferay plugin.

Approach

The overall approach taken for persistent multi-user Apache Tomcat installations is to disable hot deployment at the application server level and include an application server restart in the application deployment process.  The specifics of implementation may vary, depending on the delivery toolchain in place, but the overall process is as follows:

  1. Disable Apache Tomcat’s hot deployment processor via modifications to the Tomcat configuration files.
  2. Modify Liferay’s deployment process to force it to deploy an atomic WAR file, rather than an exploded application tree.
  3. For each deployment, remove the target Tomcat instance from circulation during the deployment process.
  4. During a deployment execution, clear the directories that Tomcat uses to store copies of the applications at runtime.
  5. Execute validation of the deployment process prior to placing the Tomcat instance back in circulation.

Technical Details

Apache Tomcat Configuration

To turn off auto deployment in Tomcat, a configuration change needs to be made to Tomcat's ${CATALINA_BASE}/conf/server.xml file. An autoDeploy attribute with a value of "false" needs to be added to the Host element nested within the Engine defined in server.xml. An example follows:

<Service name="Catalina">
    <Connector port="8009" protocol="AJP/1.3" redirectPort="8443" URIEncoding="UTF-8" />
    <Engine name="Catalina" defaultHost="localhost">
        <Host name="localhost" appBase="webapps" unpackWARs="true" autoDeploy="false">
        </Host>
    </Engine>
</Service>

Liferay Configuration

To prevent the Liferay deployment process from automatically expanding deployed plugins into the ${CATALINA_BASE}/webapps directory, the following configuration needs to be added to the Liferay portal-ext.properties file for the instance.

auto.deploy.unpack.war=false

The purpose of this change is to provide some differentiation between applications that have only been processed by Liferay and applications which have been fully deployed by an application server restart.

Standard Plugin Deployment Process

Before any deployments are performed on a Liferay instance, that instance should be taken out of circulation at the load balancer or web server level. This prevents traffic from reaching the instance on which deployments are being performed, eliminating the risk that end user requests will inadvertently be routed to the target server during the deployment process. This has the secondary benefit of providing a safe window after the deployment has been done during which deployment validation can occur.  Details on how to remove an individual Tomcat instance from circulation depend heavily on the load balancer or web server in use, and are beyond the scope of this document.

Once the application server has stopped receiving inbound traffic from the load balancers/web servers, the following steps should be executed:

  1. Move the target web application WAR file from the Tomcat ${CATALINA_BASE}/webapps directory to a temporary backup location.  If any step of the deployment process fails, restore to the previous state by copying this backed up WAR file back into the ${CATALINA_BASE}/webapps directory.
  2. Delete the directory containing the expanded version of the target application from the ${CATALINA_BASE}/webapps directory.
  3. Copy the plugin WAR that you want to be deployed to the Liferay deploy directory, located in ${liferay.home}/deploy.
  4. Verify that the operating system user that owns the Liferay process has full write permissions on the copied artifact.
  5. Wait for the new version of the application WAR to be available in the ${CATALINA_BASE}/webapps directory. Typically this is denoted by Liferay in the catalina.out and liferay-*.log files with the following message:
Deployment will start in a few seconds

At this point the Liferay plugin deployment processing activity is complete.

  1. Stop the Liferay Apache Tomcat application server process.
  2. Delete the contents of the ${CATALINA_BASE}/temp and ${CATALINA_BASE}/work directories.
  3. Start the Liferay Apache Tomcat application server process.
  4. Perform functional validation testing for each of the deployed applications.
  5. Remove the original target web application from the temporary backup location.

Once this process is complete, the Apache Tomcat server can be placed back into circulation at the load balancer or web server level.
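Steps 3-5 above (copying the plugin WAR into the Liferay deploy directory and waiting for Liferay's preprocessing to finish) lend themselves to scripting as part of a delivery pipeline. The following is a minimal, illustrative sketch under assumed paths and timeouts, not a Liferay API and not a definitive implementation; adjust the locations, polling interval, and error handling to your environment and toolchain.

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class PluginDeployHelper {

    // Assumed locations -- adjust to match ${liferay.home} and ${CATALINA_BASE} in your environment.
    private static final Path DEPLOY_DIR = Paths.get("/opt/liferay/deploy");
    private static final Path WEBAPPS_DIR = Paths.get("/opt/liferay/tomcat/webapps");

    /**
     * Copies the plugin WAR into the Liferay deploy directory (step 3) and then polls the
     * Tomcat webapps directory until the processed WAR appears (step 5).
     */
    public static void deploy(Path pluginWar, long timeoutMillis) throws Exception {
        // Step 3: hand the WAR to Liferay's auto-deploy scanner.
        Files.copy(pluginWar, DEPLOY_DIR.resolve(pluginWar.getFileName()),
            StandardCopyOption.REPLACE_EXISTING);

        // Step 5: wait for Liferay to finish preprocessing and place the WAR in webapps.
        Path processed = WEBAPPS_DIR.resolve(pluginWar.getFileName());
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (!Files.exists(processed)) {
            if (System.currentTimeMillis() > deadline) {
                throw new IllegalStateException("Timed out waiting for " + processed
                    + "; check catalina.out and the liferay-*.log files");
            }
            Thread.sleep(5000);
        }
        // The Tomcat stop, temp/work cleanup, start, and validation steps still follow, outside this helper.
    }

    public static void main(String[] args) throws Exception {
        deploy(Paths.get(args[0]), 10 * 60 * 1000L);
    }
}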

Liferay EXT Plugin Deployment

Liferay EXT plugins are an edge case in the Liferay plugin deployment process. These plugins actually modify the installed instance of the Liferay application itself, and they often include JAR files which need to be added to the application server's global classpath; to accommodate these additional requirements, we recommend using a process similar to the one used to apply and deploy Liferay application patches/hotfixes. A sample process for deploying EXT plugins is as follows:

On a clean bundle matching the version of Liferay that you intend to deploy, do the following:

  1. Unzip the target bundle into a temporary location.
  2. Apply all necessary hotfixes & patches for the target environment to the bundle’s Liferay instance.
  3. Start the bundle using the Tomcat startup.sh or startup.bat script.
  4. Deploy the EXT plugin to the temporary bundle by dropping it into the bundle’s deploy directory.
  5. Wait for the EXT plugin to be deployed by the temporary Liferay instance.
  6. Shut down and restart the temporary Liferay instance so that the EXT plugin's changes are fully applied.
  7. Shut down the temporary Liferay instance.
  8. Bundle the contents of the ROOT application in the temporary Liferay Tomcat instance’s webapps directory into a ROOT.war file.  
  9. Copy the ROOT.war file along with all EXT-generated JAR files from the temporary Liferay Tomcat instance’s lib/ext directory into a central location for deployment.
  10. Remove the temporary Liferay instance.

On each of the application servers targeted for deployment, execute the following:

  1. As for standard plugins, first remove the target Tomcat instance from the load balancer or web server.  
  2. Stop the Liferay Apache Tomcat application server process.
  3. Move the following files from the Tomcat ${CATALINA_BASE}/webapps directory to a temporary backup location.  If any step of the deployment process fails, restore to the previous state by copying these backed up files back into the ${CATALINA_BASE}/webapps directory.
    1. ROOT.war
    2. The WAR file for the target application
  4. Move any JAR files which will be added by this EXT plugin from the Tomcat ${CATALINA_BASE}/lib directory to a temporary backup location.  If any step of the deployment process fails, restore to the previous state by copying these files back into the source directory.
  5. Delete the ROOT directory containing the expanded version of the Liferay Portal application from the ${CATALINA_BASE}/webapps directory.
  6. Delete the contents of the ${CATALINA_BASE}/temp and ${CATALINA_BASE}/work directories.
  7. Deploy the Liferay Portal application ROOT.war file to the target application server’s ${CATALINA_BASE}/webapps directory.
  8. Deploy all JAR files gathered from the temporary Liferay bundle’s lib/ext directory to the target application server’s ${CATALINA_BASE}/lib directory.
  9. Start the Liferay Apache Tomcat application server process.
  10. Perform functional validation testing for each of the deployed applications.
  11. Remove the original ROOT.war and JAR files from the temporary backup location.
  12. Place the target server back into circulation at the load balancer or web server level.

The post Rock-Solid Liferay Plugin Deployments appeared first on XTIVIA.

Do you want to integrate Jaspersoft Reports with Liferay?



Jaspersoft Reports and Analytics is a market-leading, enterprise open source Business Intelligence software that is used to bring timely, actionable data to the right people at the right time. And Liferay is a market leading Portal and Digital Experience Platform that is used to deliver amazing user experiences through multiple digital channels including web and mobile to customers, partners, and employees. Now, wouldn’t it make sense to deliver actionable business intelligence insights to your user base via your enterprise portal? This is where we see increasing interest in the marketplace to integrate Jaspersoft with Liferay, and this article speaks to an approach that we have used successfully to deliver a truly integrated, fully responsive User Experience on Liferay that delivers dashboards and reports leveraging your Jaspersoft investment.

Use Case
You wish to create an interactive dashboard and/or web reporting tool by creating reports in Jaspersoft and delivering these reports responsively through Liferay, and you wish to allow these reports to interact with each other – in other words, if you select a time slice or geographical region or product category in one report, you want the other reports on the page to refresh based on the event and the selection made.

Solution Overview
Jaspersoft provides a JavaScript library, Visualize.js, that can help you pull Jaspersoft reports into Liferay. While integrating Visualize.js with Liferay can be somewhat tricky, you can use jQuery events to make communication between the reports simple.

Getting Started
To begin, we will use this demo, separate the two reports into two different portlets, and show that we can use a jQuery event for communication between them. To integrate with Liferay, just include the visualize.js and jQuery files in your theme and you're set.

<script src="https://visualizejsdemo.jaspersoft.com/jasperserver-pro/client/visualize.js">
</script><script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.4/jquery.min.js" type="text/javascript"></script>

Initialize the two reports in two portlet instances. For the first portlet, initialize the report and use the Visualize.js click event to trigger a jQuery event.

v.report({
    resource: master,
    container: '#master',
    linkOptions: {
        events: {
            "click": function(evt, link) {
                var p = [link.parameters.store_state, link.parameters.total_sales];
                $('#master').trigger('master:click', p); // trigger a jQuery event to inform the other report
            }
        }
    },
    error: function(err) {
        console.log(err.message);
    }
});

And for the second portlet, initialize the report and attach a handler to the first report so that when the event triggers, it can update the second report accordingly.

var slaveReport = v.report({
    resource: slave,
    container: '#cities',
    params: {
        state: ['CA']
    },
    events: {
        reportCompleted: function(status, error) {
            if (status === "ready") {
                /*
                     Nothing to see here... but the report has finished.
                      */
            } else if (status === "failed") {
                error && alert(error);
            }
        }
    },
    error: function(err) {
        alert(err.message);
    }
});

// listen for the event triggered by the master report's click and update the second report
$('#master').on("master:click", function(event, y, z) {
    var parameters = { state: [ y ] };
    slaveReport.params(parameters).run();
});

At XTIVIA, we have developed a configurable Liferay portlet that lets you connect to a Jaspersoft Report Server and pull in relevant reports, and we also have experience integrating Jaspersoft with Liferay from a security (authentication and authorization) perspective. I and the rest of the XTIVIA team look forward to sharing more about these tools and techniques in future blog posts.

For questions or immediate help with your Liferay/Jaspersoft initiative, please contact us at info@xtivia.com.

The post Do you want to integrate Jaspersoft Reports with Liferay? appeared first on XTIVIA.

Liferay DXP Audience Targeting 2.0 Overview


This article is a primer intended to give you a Liferay DXP Audience Targeting overview. You do not need to know much (or anything) about Liferay in order to understand the concepts laid out here.

You may have noticed personalized web experiences are everywhere. Once you browse Amazon for a kitchen gadget, the next thing you know you're seeing ads for the SlapChop on Facebook! It's amazing (and a little scary) that your actions or behaviors on one site influence the content you see somewhere else. But how could you apply this to your business on your own sites?

Personalized content delivery traditionally required custom development until Liferay introduced the Audience Targeting application back in Q4 2014. However, with the release of Liferay DXP (What is a Digital Experience Platform?) and the latest version of the Audience Targeting application (v2.0), you can truly personalize content delivery and your end user experience with very little effort. Let's go ahead and break down how it works.

Liferay DXP Audience Targeting Architecture

This diagram gives you a visual to go along with the Liferay DXP audience targeting overview. We will talk through the concepts, which applications are used to provide the end user experience, and wrap up with the flow you would need to work through to get something set up.

Liferay DXP Audience Targeting Concepts

There are three main concepts to go through here in order to fully grasp Liferay DXP audience targeting: user segments, rules, and campaigns.

User Segments

The overarching concept you need to understand about personalized content delivery is the “User Segment”. This is your audience. They are essentially a collection of users based on some rules. One easy way to grasp the concept is to think of a user segment like a sophisticated user group. When you define the user segment you also define some criteria to establish who is in the group. If a user meets the criteria, they’re in; if they don’t, they’re not.

Rules

The criteria on your user segments are a combination of rules which you define. Liferay DXP provides a myriad of items you can build these rules around. The user segment rules you can define with Liferay DXP out of the box can be classified into two broad categories: user attributes and user behaviors. In this context, user attributes could also encapsulate data related to the user's session or their social profiles.

Creating your user segments in Liferay DXP really gets interesting when you start creating your own custom rules (you have to write some code for these). These custom coded rules can integrate with whatever you want in order to meet the needs of your business. So, for example, if you want to build a user segment for your partners who have sold more than ‘n’ types of product, ‘x’ dollars in service contracts, or some combination thereof, you can!

Campaigns

Campaigns come into the picture when you are ready to start delivering content to your audience related to some event(s) that is going to occur. This could be just about anything; open enrollment at your company, a pending IPO, a product promotion, Christmas ads, etc. The key aspect of a campaign is it must have definite start and end dates.

You will essentially define a campaign, when it will run, and which audiences are targeted by the campaign. In the current release, campaigns surface content on-screen only. Liferay plans to introduce email campaigns in the near future.

Liferay DXP Audience Targeting Applications

When it comes to delivering content to your audiences, Liferay DXP provides three out-of-the-box applications to surface content. These applications are:

  • Campaign Content Display
  • User Segment Content Display
  • User Segment Content List

Each of these applications is used in different scenarios. Both the Campaign Content Display and User Segment Content Display applications will display a single piece of content on the screen. When the page loads, the user's attributes and/or behavior are evaluated against the configuration to determine which content is the correct item to see. The logic in the configuration is effectively an IFTTT (If This Then That) flow.

The User Segment Content List application displays a dynamic listing of content. This application will query the system using a combination of the User’s information and the application configuration to display all matching content items.

Liferay DXP Audience Targeting Reports

Liferay DXP has some reporting built into the product which allows you to understand how users in your User Segments are using the system. The Summary Overview of your User Segment in Liferay will identify how many users are in the segment. Authenticated users (aka logged in users) also show up within the User Segment reporting pages.

Let's say, for example, you were to break out your user segmentation by state. You could easily identify how many users have visited your website from each state without having to go outside the Liferay system to Google Analytics or another analytics tool. Taking this example a step further, knowing the specific users who have logged into your site and seen specific content / pages opens the door to other forms of campaigns (email, for example) that you may be conducting outside the Liferay system.

Within each User Segment, Liferay also provides a breakdown of the pages and content users have viewed throughout the site. This is an important distinction. Add-on analytics tools can only go so far without some special configuration and / or coding. Seeing what pages people visit is a standard data point, and has been since the beginning of the web. Seeing the specific content items a user viewed and what page they were on when they viewed it can be powerful.

Summary

Audience segmentation and content targeting are crucial elements of your digital strategy that bring several benefits including enhanced engagement with your user base, increased efficiency for your users as relevant content bubbles up to the top, improved conversion of prospects as they find more relevant offers / content, and improved overall user satisfaction.

With Liferay's audience targeting solution, you have the tools you need to run through the build-measure-learn loop and optimize the user experience. The Liferay audience targeting application suite ships with 1 administrative application, 3 display applications, 40 segmentation rules, 3 predefined reports, and 2 simulation options that get you going quickly. Additionally, this solution is fully modular (you can uninstall or replace components as needed) and extensible (the SDK allows you to create new rules, metrics, and reports), and at the same time it can integrate with other systems.

If you have questions on how you can best leverage audience targeting and / or need help with your Liferay DXP Implementation, please reach out to us.

 

Additional Reading

You can also continue to explore Liferay DXP by checking out The Top 10 New Features in Liferay DXP 7 and Top 10 Features of Liferay Forms from a functional perspective or Top 5 New Features in Liferay DXP UI Development and Creating JAX-RS REST Services in Liferay DXP from a development perspective.

If you liked what you read, please share this helpful post with your friends and co-workers on your favorite social media.

The post Liferay DXP Audience Targeting 2.0 Overview appeared first on XTIVIA.

Using Jaspersoft PreAuth to Deliver Secure Reports on Liferay


We see increasing interest in the marketplace to integrate Jaspersoft with Liferay, and at XTIVIA, we have successfully implemented a truly integrated, fully responsive User Experience on Liferay that delivers dashboards and reports leveraging your Jaspersoft investment. This article describes an approach for achieving Liferay Jaspersoft security integration, using Jaspersoft PreAuth to deliver secure reports on Liferay.

Jaspersoft Report Server has multiple ways to authenticate users, such as CAS, LDAP, or an external database. These authentication mechanisms are targeted towards achieving user authentication on the Jaspersoft server itself. This article, however, explains the details of establishing a trust relationship with the Jaspersoft server in order to render reports on a Liferay Portal page using Visualize.js in a custom portlet.
Note: Not all use cases are created equal. This solution architecture provides a fine-grained way of propagating user data from the main user repository (Liferay in this example) to Jaspersoft. PreAuth is one of many authentication mechanisms provided by Jaspersoft. If your use case is different from what is explained here, please review your authentication options as there may be a more appropriate alternative for your scenario.
 

For those of you familiar with Single Sign-On (SSO) and SAML, you could draw the analogy of Liferay acting as the Identity Provider and Jaspersoft Report Server acting as the Service Provider in order to understand the relationship. The terms identity provider and service provider are more commonly used in the context of SSO achieved via SAML; however, we are not using SAML here. The rationale for using these terms is only to establish that Liferay is the source repository of all the users, their roles, their organization hierarchy (if it exists), etc. Jaspersoft Report Server uses the user information provided by Liferay (in a secure way) to create/update the user details (including the roles and organization hierarchy) in the Jaspersoft user DB.

Getting Started

Understanding Pre-Authentication

As the name suggests, this mechanism assumes the user is successfully authenticated on a trusted system (Liferay) and is trying to render the reports from a remote Jaspersoft Report Server instance using Visualize.js. PreAuth is designed to handle scenarios where the user repository is located in an external identity provider which acts as the user authentication system and holds the user account information and access rights (like user roles, organization hierarchy etc). This mechanism assumes the incoming user request to the Jaspersoft Report Server has all the required details of the user to successfully create the user account in Jaspersoft user repository.

Let's delve into the details of integrating Liferay with Jaspersoft Report Server to render reports securely. In order to achieve the required level of security, we need to synchronize the user details into the Jaspersoft server. This article assumes Liferay Portal is used as the user repository. Users will log in to Liferay and navigate to the reports page, where user details are securely sent to the Jaspersoft Report Server using Visualize.js. Upon receiving the user details, Jaspersoft Report Server will create/update the user details using the Pre-Authentication module. The diagram below depicts the high-level architecture of integrating Liferay Portal with Jaspersoft Report Server using Visualize.js and Jaspersoft's PreAuth module to establish user identity on the Jaspersoft server.

Fig 1 Liferay Jaspersoft Security Integration

Figure 1: High-Level Architecture of Liferay - Jaspersoft PreAuth

Architecture

Let's take a deeper dive into the high-level architecture depicted in Figure 1. As mentioned earlier, this architecture assumes the users already exist in Liferay Portal and authenticate through Liferay Portal. The high-level flow is as follows:

  • User successfully logs in to Liferay and is redirected to the Report rendering page. On the report rendering page, we will add the custom report portlet
  • The custom report portlet will gather the user details, concatenate them in the format expected by the Jaspersoft PreAuth module, and append the current timestamp
  • Encrypt the user details and Base64-encode them. Send the encoded user details to the front end, where Visualize.js will use them to connect to Jaspersoft and render the report
  • Once the Jaspersoft server receives the report request, it will check for the PreAuth request parameter. Once the PreAuth request token is identified, the configured PreAuth Spring module will come into play
  • The PreAuth module will decode and decrypt the user details string and check for the current timestamp to validate the token expiry date
  • Assuming the current timestamp is in the acceptable delta, parse the delimited user details and create or update the user in Jaspersoft as appropriate
  • Upon the user provisioning completion, Jaspersoft Report Server serves the requested report by applying the user authorization details. If the user is not authorized to view the report, an appropriate message will be sent back as the response to the request sent by the Visualize.js code snippet. If the user is authorized to view the report, Jaspersoft will send the requested report as the response

Now that we have seen the high-level flow of the PreAuth, let us try to delve into the details of the individual modules that participate in this mechanism. Let us start with the Liferay Container. It contains different modules that will help in creating a PreAuth token.
Note: Liferay Portal platform provides a variety of features to create multi-tenant enterprise applications. There are many cool features like CMS, advanced authorizations, workflows, digital asset management, etc. The depiction shown in Figure 2 only shows the modules of the Liferay Portal that are of interest in the context of this article. There are a lot of business/logical flows behind the modules which are not covered in this article since they are considered out of scope.

Fig 2 liferay-jaspersoft
Figure 2: Liferay Container showing the modules participating in PreAuth
  • User Authentication Module: This module takes the user’s credentials from the request, authenticates it using the configured Liferay authentication mechanism, and establishes a user session. If the user enters invalid credentials, the authentication will not succeed
  • Layout Renderer Module: Upon successful authentication, the Layout Renderer module will render the report page. The Layout renderer module will play the role of authorization. It will see whether the logged-in user has access to the requested page and redirect the user accordingly
  • Report Page: A page in Liferay is nothing but a Layout. The Layout renderer module renders the appropriate layout. The rendered layout will have a portlet or multiple instances of the portlet (this is a custom portlet that we developed) configured on the layout
  • Custom Report Portlet: The Custom Report Portlet, as the name suggests, is a custom portlet developed by XTIVIA to render the Jaspersoft reports using the Visualize.js library. This portlet holds the logic of gathering the required user data, converting it into a token, and sending that token to Jaspersoft via Visualize.js. Figure 3 depicts the flow for the token formation.

Note: The implementation details of the Custom Report Portlet are not covered in this blog article as it is considered out of scope for this context. XTIVIA will soon release this as a Marketplace application compatible with Liferay 6.2 EE as well as Liferay DXP.

Fig 3 liferay-jaspersoft
Figure 3: Flow for user data token creation on Liferay
  • Get the user details required by the Jaspersoft Report Server to create/update the user. The information includes, but is not limited to, the user's screenName, emailAddress, firstName, lastName, etc. As a good practice, always make sure you include the user's UUID and use it as the unique identifier for the user on Jaspersoft
  • Based on the requirements, you can also fetch user roles, their organization hierarchy, user groups, etc
  • Once you have all the required user information, append the user information into a pre-defined (see callout #5) string format. The pre-defined string format will be dictated by the Jaspersoft Report Server. This can be any format that works best in your situation
  • Also append the user’s lastModifiedDate so that it can be used on the Jaspersoft side to ensure whether the user record needs an update or not
  • Get the current timestamp and append it to the user details string created in the previous step. The current timestamp is used to validate the expiry time of the token that will be sent to Jaspersoft. This is a crucial security step to restrict access to the Jaspersoft Report Server if the token were stolen and used to access reports directly. By appending the timestamp, Jaspersoft can check the validity of the token by making sure the time difference between the appended timestamp and the current time on the Jaspersoft server when it receives the request is not more than the pre-configured delta value (this can be something like 60 seconds, 10 minutes, or whatever works for your scenario)
  • The next step is to encrypt the user details string (with current timestamp appended) using an encryption algorithm like AES 128 or any other algorithm that is approved by your company standards. Importantly, make sure you are using a 2-way encrypt/decrypt supported algorithm as this will be decrypted on the Jaspersoft side. Why encrypt?
    • Visualize.js sends the user token via http(s) to the Jaspersoft server (it is strongly recommended to connect to the Jaspersoft server using SSL to avoid token interception). Since the user details are sent over the wire, the chances of a man-in-the-middle attack are high and the user details can be hijacked. Encryption prevents a hijacker from reading or tampering with the token
  • Base64 encode the encrypted user details and hand it over to the Visualize.js script so it can securely render the reports from the Jaspersoft server

Pre-Defined String: The pre-defined string should comply with the format that Jaspersoft defines, as Jaspersoft uses this format to delimit the user details and save them to the user DB. For example, the concatenated user details can be in the format of

u=jdoe|r=Manager,Organizer|o=Xtivia-Austin-Texas-USA,o=Xtivia-Colorado Springs-Colorado-USA |firstName=john|lastName=Doe|emailAddress=johndoe@xtivia.com|uuid=123e4567-e89b-12d3-a456-426655440000|lastModifiedTimestamp=1469674564%currentTimestamp=2499674564

If you carefully observe the currentTimestamp, which is appended at the end, it uses % as the delimiter instead of | so that it can be easily identified from the rest of the user details
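Putting the token-creation steps above together, here is a minimal sketch of what the Liferay-side token building could look like. This is illustrative, not XTIVIA's actual portlet code: the class name, the SHA-256-based key derivation, the AES/CBC mode, and the shared secret and IV handling are all assumptions that must mirror whatever your Jaspersoft-side CipherI implementation expects.

import java.security.MessageDigest;
import java.util.Arrays;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class PreAuthTokenBuilder {

    // Assumptions: the shared secret and IV must match the Jaspersoft-side decryptor.
    private final String sharedSecret;
    private final byte[] ivBytes;

    public PreAuthTokenBuilder(String sharedSecret, byte[] ivBytes) {
        this.sharedSecret = sharedSecret;
        this.ivBytes = ivBytes;
    }

    /** Builds the delimited user details string in the pre-defined format shown above. */
    public String buildUserDetails(String screenName, String roles, String orgs,
            String firstName, String lastName, String emailAddress, String uuid,
            long lastModifiedTimestamp) {
        return "u=" + screenName + "|r=" + roles + "|o=" + orgs
            + "|firstName=" + firstName + "|lastName=" + lastName
            + "|emailAddress=" + emailAddress + "|uuid=" + uuid
            + "|lastModifiedTimestamp=" + lastModifiedTimestamp
            + "%currentTimestamp=" + (System.currentTimeMillis() / 1000L);
    }

    /** Encrypts the user details with AES/CBC and Base64-encodes the result. */
    public String encryptAndEncode(String userDetails) throws Exception {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        byte[] keyBytes = Arrays.copyOf(digest.digest(sharedSecret.getBytes("UTF-8")), 16);
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(keyBytes, "AES"),
            new IvParameterSpec(ivBytes));
        byte[] encrypted = cipher.doFinal(userDetails.getBytes("UTF-8"));
        // The Base64-encoded value is what gets handed to Visualize.js as the pp token.
        return Base64.getEncoder().encodeToString(encrypted);
    }
}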
 

To this point we have mostly focused on the role of Liferay and building the PreAuth token that will be transferred to the Jaspersoft Report Server as part of a request for a report. The next section explains how Jaspersoft processes the request and uses the PreAuth module to provision the user(s).

The Jaspersoft Report Server interprets the incoming report request, looks for the PreAuth request parameter, and if it exists will invoke the PreAuth Spring module configured to process the PreAuth tokens to provision the user details. Let’s take a closer look at the process flow shown in Figure 4.

Fig 4 liferay-jaspersoft

Figure 4: Jaspersoft Pre-Authentication Token Processing Flow
  • Jaspersoft receives the request and looks for the PreAuth request parameter. If it exists, the PreAuth Spring module comes into play and handles the user provisioning and authorization
  • Upon receiving the token, the custom created decrypt class will decode and decrypt the token
  • Then it will retrieve the token validity timestamp from the decrypted user details string and compare it with the current timestamp. If the delta is beyond the pre-configured value (which can be 60 seconds, 10 minutes, or whatever works for your scenario), the request will not be served and the Jaspersoft server will send an error response. If the delta is within the pre-configured value, the flow proceeds to the user provisioning module (a minimal sketch of this check follows after this list)
  • In the user provisioning module, the user details string will be delimited based on the Pre-Defined format as noted above. Next, it will verify whether the user record exists in Jaspersoft user DB. If no, then it will create a new user record and assign the user to the roles/organization hierarchy. If yes, then it will verify whether the incoming user details lastModifiedTimestamp is greater than the saved user lastModifiedTimestamp. If it is greater, then update the user records with the new values or else skip user update and process the authorization details and serve the report
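As a minimal illustration of the expiry check described in the list above, the sketch below assumes the decrypted user details string ends with %currentTimestamp=<epoch seconds>, as in the earlier example, and that the allowed delta is configured in seconds; the class and method names are assumptions for illustration, not Jaspersoft API.

public class TokenExpiryValidator {

    private final long maxDeltaSeconds;

    public TokenExpiryValidator(long maxDeltaSeconds) {
        this.maxDeltaSeconds = maxDeltaSeconds;
    }

    public boolean isTokenValid(String decryptedUserDetails) {
        int idx = decryptedUserDetails.lastIndexOf("%currentTimestamp=");
        if (idx < 0) {
            return false; // malformed token
        }
        long tokenTimestamp = Long.parseLong(
            decryptedUserDetails.substring(idx + "%currentTimestamp=".length()).trim());
        long nowSeconds = System.currentTimeMillis() / 1000L;
        // Reject the request if the token is older (or newer) than the configured delta.
        return Math.abs(nowSeconds - tokenTimestamp) <= maxDeltaSeconds;
    }
}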

Enabling Pre-Authentication on Jaspersoft

Jaspersoft uses Spring Security to achieve user authentication, and configuring PreAuth is very straightforward. Jaspersoft, as part of its distribution, provides a variety of sample configuration files which are used to enable additional modules. These files are located in the <js-install>/samples/ folder. Under the samples folder, you will find an externalAuth-sample-config folder which holds different flavors of externalAuth files to enable external authentication. The file of our interest is sample-applicationContext-externalAuth-preAuth-mt.xml. This is a Spring context XML file and will be used to enable PreAuth via external authentication.

To enable PreAuth, take a copy of sample-applicationContext-externalAuth-preAuth-mt.xml and rename it applicationContext-externalAuth-preAuth-mt.xml. The Jaspersoft community provides a thorough explanation of this file, and going through all of its bean definitions here would duplicate that effort; to get a good understanding of the file, please read the Jaspersoft Community documentation. However, I'll explain some of the important bean definitions that participate in the token processing.

[code lang="xml"]<bean id="proxyPreAuthenticatedProcessingFilter" class="com.jaspersoft.jasperserver.api.security.externalAuth.preauth.BasePreAuthenticatedProcessingFilter">

    <!-- request parameter containing pre-authenticated token with user info -->
    <property name="principalParameter" value="pp"/>

    <!-- tokenInRequestParam=false - principalParameter is read from header only.
         tokenInRequestParam=true - principalParameter is read from request url parameters only.
         If tokenInRequestParam is not specified, the authentication token is looked up in the request header and, then, if not found, in request params -->
    <property name="tokenInRequestParam" value="true"/>

    <!-- Works with plain-text tokens by default -->
    <!-- Substitute with your token decryption implementation of the com.jaspersoft.jasperserver.api.common.crypto.CipherI interface -->
    <property name="tokenDecryptor">
        <bean class="com.jaspersoft.jasperserver.api.common.crypto.PlainTextNonCipher"/>
    </property>
    <property name="externalDataSynchronizer">
        <ref local="externalDataSynchronizer"/>
    </property>
    <property name="authenticationManager">
        <ref local="preAuthenticatedManager"/>
    </property>
    <property name="authenticationDetailsSource">
        <bean class="com.jaspersoft.jasperserver.api.security.externalAuth.wrappers.spring.JSAuthenticationDetailsSourceImpl">
            <property name="clazz">
                <value>com.jaspersoft.jasperserver.api.security.externalAuth.preauth.BasePreAuthenticatedGrantesAuthorityDetails</value>
            </property>
        </bean>
    </property>
    <property name="jsonRedirectUrl" ref="authSuccessJsonRedirectUrl"/>
</bean>[/code]

proxyPreAuthenticatedProcessingFilter: This Spring bean is used to enable PreAuth. This bean causes the Spring Security filter chain to handle the authentication processing via proxy bean instead of using the default internal filter chain.

  • <property name="principalParameter" value="pp"/>: This property triggers the token-based authentication. The principalParameter is expected to be present in the incoming request under the configured name; with the value "pp", the request should carry pp=<token> either in the request header or as a request parameter. You can use another name of your choice instead of "pp".
  • <property name="tokenInRequestParam" value="true"/>: This property acts as a flag to locate the token in the request. A value of true indicates the token comes in as a request parameter, whereas a value of false indicates the token comes in as a request header. If this property is absent, the code looks first in the request header and then in the request parameters
  • <property name="tokenDecryptor"> <bean class="com.xtivia.jaspersoft.cipher.CustomCipher"/> </property>: tokenDecryptor aids in decrypting the token. Replace the bean class with your custom implementation, which should implement Jaspersoft's decryption interface CipherI. com.xtivia.jaspersoft.cipher.CustomCipher is a custom implementation that we created for our use case. The following code snippet shows a sample decrypt method implementation (it does not show the complete token decryption implementation, only a part of it; a sketch of the missing helpers follows after the snippet. Please make sure you are using the same algorithm that you used to encrypt on the Liferay side)

[code lang="java"]public class CustomCipher implements CipherI {

    private static final Log logger = LogFactory.getLog(CustomCipher.class);

    // Note: password, ivBytes, CHARSET, generateKey(...) and the decrypt(key, iv, bytes)
    // helper are defined elsewhere in the class and are not shown here; they must use the
    // same algorithm and key material used to encrypt the token on the Liferay side.
    @Override
    public String decrypt(String text) {
        if (text != null) {
            try {
                logger.info("string received: [" + text + "]");
                final SecretKeySpec key = generateKey(password);

                byte[] decodedCipherText = Base64.decodeBase64(text);

                byte[] decryptedBytes = decrypt(key, ivBytes, decodedCipherText);
                String message = new String(decryptedBytes, CHARSET);
                return message;
            } catch (UnsupportedEncodingException e) {
                e.printStackTrace();
            } catch (NoSuchAlgorithmException e) {
                e.printStackTrace();
            } catch (GeneralSecurityException e) {
                e.printStackTrace();
            }
            return text;
        }
        return null;
    }
}[/code]
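For completeness, one possible shape for the helpers referenced above (generateKey and the byte-level decrypt) is sketched below. These are illustrative assumptions, not XTIVIA's or Jaspersoft's actual code; the key derivation, IV handling, charset, and cipher mode must mirror exactly what you used to encrypt the token on the Liferay side. The required imports include javax.crypto.Cipher, javax.crypto.spec.SecretKeySpec, javax.crypto.spec.IvParameterSpec, java.security.MessageDigest, and java.util.Arrays.

[code lang="java"]// Hypothetical helpers -- adapt to match the Liferay-side encryption.
private static final String CHARSET = "UTF-8";

private SecretKeySpec generateKey(String password)
        throws NoSuchAlgorithmException, UnsupportedEncodingException {
    // Example key derivation: SHA-256 hash of the shared secret, truncated to 128 bits.
    MessageDigest digest = MessageDigest.getInstance("SHA-256");
    byte[] keyBytes = Arrays.copyOf(digest.digest(password.getBytes(CHARSET)), 16);
    return new SecretKeySpec(keyBytes, "AES");
}

private byte[] decrypt(SecretKeySpec key, byte[] ivBytes, byte[] cipherText)
        throws GeneralSecurityException {
    // AES/CBC/PKCS5Padding with an explicit IV, matching the Liferay-side encryption.
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    cipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(ivBytes));
    return cipher.doFinal(cipherText);
}[/code]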

  • <property name="authenticationManager"><ref local="preAuthenticatedManager"/></property>: This property references the preAuthenticatedManager bean, which provides the facility to process the authentication token. The preAuthenticatedManager bean definition looks as follows

[code lang="xml"]<bean id="preAuthenticatedManager" class="com.jaspersoft.jasperserver.api.security.externalAuth.wrappers.spring.JSProviderManager">
    <property name="providers">
        <list>
            <!-- This bean calls upon preAuthenticatedUserDetailsService to create user details based on a token extracted from the request by proxyPreAuthenticatedProcessingFilter -->
            <bean class="com.jaspersoft.jasperserver.api.security.externalAuth.wrappers.spring.preauth.JSPreAuthenticatedAuthenticationProvider">
                <property name="preAuthenticatedUserDetailsService">
                    <bean class="com.jaspersoft.jasperserver.multipleTenancy.security.externalAuth.preauth.MTJSPreAuthenticatedUserDetailsService">
                        <!-- Token format configuration example for token: u=obama|r=PRESIDENT,HUSBAND|o=WhiteHouse|pa1=USA,Kenya|pa2=Washington -->
                        <property name="tokenPairSeparator" value="|"/>
                        <property name="tokenFormatMapping">
                            <map>
                                <entry key="username" value="u" />
                                <entry key="roles" value="r" />
                                <entry key="orgId" value="o" />
                                <entry key="expireTime" value="exp" />
                                <entry key="profile.attribs" >
                                    <map>
                                        <entry key="profileAttrib1" value="pa1" />
                                        <entry key="profileAttrib2" value="pa2" />
                                    </map>
                                </entry>
                            </map>
                        </property>
                        <property name="tokenExpireTimestampFormat" value="yyyyMMddHHmmssZ"/>
                    </bean>
                </property>
            </bean>
        </list>
    </property>
</bean>[/code]

  • The com.jaspersoft.jasperserver.api.security.externalAuth.wrappers.spring.preauth.JSPreAuthenticatedAuthenticationProvider dictates the token format, the delimiter and the mapping of the user variables. As per the above example, tokenPairSeparator is the pipe (|). However, you can change it to any other character. Please make sure that this is the delimiter that you use to construct the user details on the Liferay Portal. You can also update the tokenFormatMapping to meet your requirements.

Once you make the required changes to the applicationContext-externalAuth-preAuth-mt.xml file, please move the file into the <JS Server Install Location>/apache-tomcat/webapps/jasperserver-pro/WEB-INF folder and restart Jaspersoft server. Assuming all the changes in the applicationContext-externalAuth-preAuth-mt.xml file are made appropriately, after the server restart the external token authentication is ready to use.

The next step is to send the encrypted user details token created by the custom reporting portlet as the PreAuth principalParameter (pp as per the configuration explained above) using Visualize.js. Decryption, user provisioning, authorization, and report serving will be taken care of by the Jaspersoft Report Server.

Additional Reading

You can read more about integrating Jaspersoft reports with Liferay in another XTIVIA blog post titled “Do you want to integrate Jaspersoft Reports with Liferay?” on the XTIVIA site at https://www.xtivia.com/integrate-jaspersoft-reports-with-liferay/

The post Using Jaspersoft PreAuth to Deliver Secure Reports on Liferay appeared first on XTIVIA.

Top 5 DevOps Features in Liferay DXP


Liferay DXP is getting a lot of attention as of late, with articles popping up detailing how the platform has substantially improved usability and extensibility for developers and end-users…but what about the operational aspect? There’s good news on that front as well; Liferay has made some substantial changes in the platform that will directly improve quality-of-life for Liferay DevOps folks.

1. Liferay Configuration Updates

A major change in Liferay DXP is the method by which configuration is applied and stored in the DXP platform. Historically, there have been two places for Liferay configuration to live: on the filesystem in one of the Liferay properties files, or in XML blobs stored in the database. The interaction between those two configuration sets was not always clear-cut, and there was no "standard" way to verify configuration or migrate it from environment to environment.

With Liferay DXP, this is changing; the old properties files are still in use in a limited capacity, but the old database-storage mechanism using an XML internal configuration representation has been eliminated in favor of a new process, leveraging the standard OSGi framework for modular Java applications. Configuration data is entered into the DXP interface as needed, and the DXP UI provides an export mechanism to allow the user to directly export the configuration set once they’re satisfied with the configuration changes. The exported configuration can then be imported directly into another Liferay DXP environment; this allows configuration to be easily promoted from one DXP environment to another while at the same time preserving an easy-to-use experience for end users to customize the platform.

2. OSGi Shell Features

As anyone who is a Liferay follower will know, Liferay, Inc. has been a vocal supporter of the OSGi initiative and has touted the fact that one of the major evolutionary changes in DXP is that it heavily leverages the OSGi framework to provide modularity and extensibility. While this is primarily a development-oriented change, the inclusion of the Apache Felix OSGi implementation into Liferay DXP has some profound effects on operations-oriented work as well. While the inclusion of Felix into the DXP runtime has many benefits, there is one specific feature of particular interest: the addition of the Gogo shell interface. The Gogo shell is Felix's implementation of RFC 147, which defines a standard command interface for OSGi modules. The full set of capabilities provided by the Gogo shell is beyond the scope of this article; however, examples of what can be done via the shell interface include:

  • Viewing and setting module or system properties
  • Viewing the current lifecycle state of a module
  • Diagnosing a module which is currently in a bad state
  • Enabling or disabling a module

This inclusion of an interactive interface to the DXP platform adds a powerful new tool to both deployment and troubleshooting activities for both operations and development teams.

3. Changes to DXP Search

Another exciting change to come to Liferay DXP is a complete reworking of the platform’s search component to vastly improve flexibility and scalability. Liferay Portal version 6.2 and earlier used an in-process Apache Lucene engine to handle all portal search requests, and while this architecture is a tried and true way to include search capabilities into the application, it presented performance and scaling problems, especially in mid-size to large implementations. To remedy this, and to future-proof the DXP platform, Liferay has completely abstracted the search implementation into an OSGi-based application, and the platform now leverages Elasticsearch to handle search indexing and queries out-of-the-box.

Elasticsearch is a full-text search engine based on the Lucene library and built on a distributed architecture that provides robust scaling and high-availability functionality through a REST-based interface. Packaging the search function into a discrete component utilizing an enterprise-class open-source engine significantly improves the ability of the DXP search function to operate in large-scale, mission-critical applications, while at the same time bringing major feature improvements for smaller applications leveraging DXP.

It is important to note that DXP also ships with the ability to swap out the default Elasticsearch implementation for an Apache Solr implementation, if needed; Elasticsearch and Solr provide a very similar feature set, and either engine can be used with no degradation in the functionality or performance provided by the DXP search function.

4. Changes to the Operational Runtime

One of the most basic changes made in Liferay DXP is that the platform now ships with Java 8 as its default target JVM runtime. This is a watershed change, allowing all of the benefits of Java 8 to finally be reliably used in a Liferay environment; in particular, this allows folks on the operations side to standardize on the G1GC collector algorithm for JVM garbage collection. A version of G1GC was available in later releases of Java 7, but it lacked a number of optimization features present in Java 8. The direct impact of using Java 8 and G1GC is lower overhead for the Java process, improved ergonomics for Java garbage collection, reduced pause times, and overall improved performance for the portal process.

5. Updated Build Processes

With the update to DXP, Liferay introduced a new set of build tools to enable faster and more reliable build automation to be put into place. Prior to DXP, builds utilized either the Ant build tool (by default) or the Apache Maven build tool; both of these tools served their purpose well, but have been showing their age in the past couple of years.


With Liferay DXP, the standard build tools are now Gradle (for Liferay/OSGi modules) and Gulp (for Liferay themes). This represents a shift in Liferay’s development tooling which will make it much easier for developers to create consistent, high-quality code for deployment into a Liferay environment with a minimum of boilerplate code authoring. Additionally, this shift mimics a shift in the overall development community; both Gradle and Gulp have seen increased adoption among the Java and JavaScript developer communities (respectively) over the past few years.

Operationally, the shift to Gradle and Gulp will simplify and improve continuous integration and delivery pipelines, and will make it much easier to build consistent processes targeted at build and release automation without requiring extensive customization of the build process or environment.

Additional Changes

A few additional changes which, while important, didn’t make the Top 5 list:

  • The default application server used by Liferay DXP is now Apache Tomcat version 8, which represents a significant upgrade from version 7.
  • The clustering component in Liferay DXP has been redesigned and, as with many of the other internal features of the platform, has been reimplemented as an OSGi-based module.

All in all, there are noteworthy changes in Liferay DXP 7, and if you are a DevOps engineer, you should certainly dig into them.

If you need assistance with your Liferay DevOps strategy and implementation or just need some advice, do reach out to us.

 

Additional Reading

You can also continue to explore Liferay DXP (What is a Digital Experience Platform?) by checking out The Top 10 New Features in Liferay DXP 7 and Liferay DXP Audience Targeting 2.0 Overview from a functional perspective or Top 5 New Features in Liferay DXP UI Development and Creating JAX-RS REST Services in Liferay DXP from a development perspective.

If you liked what you read, please share this helpful post with your friends and co-workers on your favorite social media.

The post Top 5 DevOps Features in Liferay DXP appeared first on XTIVIA.
